Author 
Topic: HARD: Hamming Distance Questions (Read 35629 times) 

D
Guest


HARD: Hamming Distance Questions
« on: Jul 26th, 2002, 3:09pm »

The wording of this problem seems confusing. I'm guessing I don't have the math background to solve it anyway, but it might be fun to think about. Do you want a formula that will generate two sets [x, y], each with n elements of length-k bitstrings, such that for each pair (x, y) the Hamming distance will be greater than c? Or maybe: what are n, x, and y, given k and c? Or: a formula that answers "a pair of sets with n elements of length-k bitstrings, optimally chosen, will have a minimum Hamming distance of c"? The first one seems much harder and more likely to be a programming question (e.g., how do you do this in linear or polynomial time?). For the second, though, you can start with some boundaries. You know that with length-k bitstrings, the maximum Hamming distance between any two bitstrings is k. So if n = 1, the answer is k. You also know that if n = 2^k, the answer is 0. Hrm... I don't have time to solve this at work right now; maybe I'll work on it later at home. This is kind of a fun problem.
D


IP Logged 



william wu
wu::riddles Administrator
Gender:
Posts: 1291


Re: HARD: Hamming Distance Questions
« Reply #1 on: Jul 26th, 2002, 4:09pm »

Sorry, I will repost a clearer wording of the problem later. It was not my intent to actually generate the elements of the optimal set, although that is also an interesting problem that we could explore. The latter question in your post is what I'm interested in: assume you already have the optimally chosen set, which maximizes how spread out the elements are in Hamming space. Then what is the minimum Hamming distance D?

Here is the notion of Hamming space, summarized in a useful picture (the cube shown is the Hamming space for k = 3). Start at a vertex. Change one bit to cross one edge; change two bits to cross two edges; change three bits to cross three edges. Note the important difference between Hamming space and the everyday notion of numeric distance: whereas we would normally say that 100 (4 in binary) and 000 (0 in binary) are four units apart, in Hamming space they are only one unit apart, and very close.

We can start with some easy cases. For instance, if n = 2^k, then D = 1. Also, if n = 2, then D = k. This is clearly illustrated by the picture above: the points furthest apart in Hamming space are on opposite corners of the cube, and those corners have exactly inverse polarities. Thanks!
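The 100-versus-000 example is easy to check mechanically. Here is a small sketch (in Python, with bitstrings represented as integers): the Hamming distance is just the popcount of the XOR.

```python
def hamming(a, b):
    # Hamming distance = number of differing bit positions = popcount of a XOR b
    return bin(a ^ b).count("1")

print(hamming(0b100, 0b000))  # 1: numerically 4 apart, but only one bit flip away
print(hamming(0b000, 0b111))  # 3: opposite corners of the k=3 cube
```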

« Last Edit: Jul 30th, 2002, 5:57pm by william wu » 
IP Logged 
[ wu ] : http://wuriddles.com / http://forums.wuriddles.com



D
Guest


Re: HARD: Hamming Distance Questions
« Reply #2 on: Jul 26th, 2002, 11:55pm »

DAMN!!! I had a huge writeup of my thoughts and it just got wiped by accidentally going back a page!! Grrr... OK, firstly, I had two questions. 1: Can the sets overlap? It's really more of a boundary question: if they don't overlap, then the minimum distance is at least 1. 2: Are the sets always of equal size, meaning that if you add one element to set x, you always add an element to set y?

OK, here's some abstract thinking about the problem. If you start with n = 1, then D = k. Consider the Hamming distance as the length of a path from an element of x to an element of y. If you add one element to each set, even if the two are of opposite polarity, you must have decreased the length of an original path by 1. However, I believe (though I still need to prove it) that, assuming equal-sized sets, you can add up to n/2 (rounded down) elements before the path decreases any further. The reason I say this is: suppose the sets weren't of equal size, and you start adding elements to set y. The optimal elements to add first would be all elements at distance 1 from the original element, which means you can add k elements before D goes down again. I think that if you're adding two elements at a time it goes down twice as fast, but I'm not sure; it may go down faster.

So... my thoughts all point to a linear solution: y = mx + b, where y = D, x = n, m is the slope, and b is the y-intercept (what D would be at n = 0).

First, let's evaluate m. The maximum D is k, when n = 1, and the minimum D is 1, when n = 2^k / 2:

m = (y1 - y2) / (x1 - x2)
m = (k - 1) / (1 - 2^(k-1))
... some algebra ...
m = -(k - 1) / (2^(k-1) - 1)

Now, b. The tricky part about b is what n = 0 means: both sets are empty? How far away are you from no element to no element? So instead we take a known point on the line (n = 1, D = k) and, knowing the slope, evaluate b:

D = mn + b
k = m + b
b = k - m
b = k + (k - 1) / (2^(k-1) - 1)

So, putting it together:

D = -[(k - 1) / (2^(k-1) - 1)]·n + k + (k - 1) / (2^(k-1) - 1)

Hrm... seems a little messy. Oh well, this is just some thoughts assuming a linear solution.
D


IP Logged 



D
Guest


Re: HARD: Hamming Distance Questions
« Reply #3 on: Jul 27th, 2002, 12:06am »

Um... I just ran some numbers. It works OK for k = 1 and k = 2, and not quite as elegantly for k = 3. The equation needs some fine-tuning; it may be enough just to round D down. But I'll think about it a little more.
D


IP Logged 



D
Guest


Re: HARD: Hamming Distance Questions
« Reply #4 on: Jul 27th, 2002, 12:19am »

It's 5 minutes later. In the last 5 minutes I've thrown out several cans of caffeine cluttering my desk, replaced them with a fresh can, and come up with another thought. It's probably not linear. The reason is that the number of paths you can take goes up exponentially, and I think that at every level you have more elements you could add while keeping the Hamming distance the same than you had at the previous level, even though the distance still went down. I'm thinking primarily about different-sized sets, because it's easier to visualize a solution, and hoping equal-sized sets would be just an extension of this. BTW, if this is the case, the equation will be some sort of D = (1/k^p)·n + c


IP Logged 



D
Guest


Re: HARD: Hamming Distance Questions
« Reply #5 on: Jul 27th, 2002, 4:49pm »

I was out doing some homework and started working on this again. I tried experimenting with k = 4. For some reason: n = 1, D = 4; n = 2, D = 3; n = 3, D = 2. I couldn't come up with a set that did better at n = 3. I was working with equal-sized sets, such that if you add an element to x, you have to add one to y as well. Interestingly enough, what I observed when choosing the next pair beyond n = 1 was that its polar opposite would have D = 4, the poles (0000, 1111) had D = 3, and any element I chose seemed to share 2 bits with the first one chosen in the opposite set.
D


IP Logged 



Bruce
Guest


Re: HARD: Hamming Distance Questions
« Reply #6 on: Jul 30th, 2002, 4:31am »

Coming at this problem from the other side: consider the set of all binary strings of length k. How many of them can we select if we insist that any pair of selected elements has a Hamming distance of at least d? Well, there are 2^(k-1)·(kCd) pairs of elements with Hamming distance d (where kCd is k!/(d!(k-d)!)), so we can select 2^(k-1)·SUM(n=d to k, kCn) elements before we are forced to break our constraint. Any help?
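Bruce's pair count is easy to sanity-check by brute force (a sketch; the claim being tested is that there are exactly 2^(k-1)·C(k,d) unordered pairs at distance d, since each of the 2^k strings has C(k,d) strings at distance d and each pair is counted twice):

```python
from itertools import combinations
from math import comb

def pairs_at_distance(k, d):
    # count unordered pairs of k-bit strings whose XOR has exactly d one-bits
    return sum(1 for a, b in combinations(range(2 ** k), 2)
               if bin(a ^ b).count("1") == d)

for k in range(2, 6):
    for d in range(1, k + 1):
        assert pairs_at_distance(k, d) == 2 ** (k - 1) * comb(k, d)
```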


IP Logged 



pio
Newbie
Gender:
Posts: 4


Re: HARD: Hamming Distance Questions
« Reply #7 on: Jul 30th, 2002, 4:35am »

I start with a hypothesis. I think that the number of bitstrings at a Hamming distance D from a given string is equal to the Dth entry of the Nth row of the Tartaglia (Pascal) triangle, that is, C(N, D). For example, for N = 3:

D=0 -> 1 bitstring
D=1 -> 3 bitstrings
D=2 -> 3 bitstrings
D=3 -> 1 bitstring

I can't do it here, but if you draw a four-dimensional hypercube (yes, you can) you may be able to see that this is also true for N = 4:

D=0 -> 1 bitstring
D=1 -> 4 bitstrings
D=2 -> 6 bitstrings
D=3 -> 4 bitstrings
D=4 -> 1 bitstring

I guess this is true for every N (anybody can prove it?). Then we can see also that we get an increase of the distance only for an odd N. With all this I've tried several approaches, and so far I've found this. My guess is that for any given K: D = K when n = 2, and D = (K+1) DIV 2 when 2 < n <= 2^(2 + ((K+1) MOD 2)). But I don't know how I can extend this for every K and every N. I think I'm on the right track. What do you think? Any suggestions?
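pio's hypothesis can be checked by brute force for small N (a sketch; strings are represented as integers, and by symmetry it is enough to count distances from the all-zeros string, since XOR by any reference string is a distance-preserving bijection):

```python
from math import comb

def count_at_distance(n, d):
    # bitstrings of length n at Hamming distance d from 00...0,
    # i.e. with exactly d one-bits
    return sum(1 for x in range(2 ** n) if bin(x).count("1") == d)

for n in range(1, 8):
    for d in range(n + 1):
        # entry d of row n of the Pascal/Tartaglia triangle
        assert count_at_distance(n, d) == comb(n, d)
```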

« Last Edit: Jul 30th, 2002, 4:52am by pio » 
IP Logged 



Steven Noble
Guest


Re: HARD: Hamming Distance Questions
« Reply #8 on: Jul 30th, 2002, 5:40pm »

pio said: "I think that the number of bitstrings that are at a Hamming distance of D from a given string is equal to the D position of the N row of the Tartaglia triangle." Later pio said: "I guess this is true for every N (anybody can prove it?)." I won't give you the proof for this statement, because there is no reason you shouldn't be able to get it yourself. The easiest way to go about it is probably an inductive proof using one of the simplest properties of the triangle (be it Tartaglia or Pascal or Yang Hui). Hope that's enough. As for the rest of the problem, has anybody made a script that will give us a bunch of data to look at? I had a number theory prof who used to say that the most important effect computers have had on mathematics is that they can give mathematicians a lot of data to stare at while we search for an answer. (He had a much more elegant way of saying it, but he strongly believed that if you stare at something for long enough, eventually the answer will come to you.)


IP Logged 



cnmne
Newbie
Gender:
Posts: 1


Re: HARD: Hamming Distance Questions
« Reply #9 on: Jul 30th, 2002, 6:31pm »

Solving for H, given N and K (1 < K, 1 < N <= 2^K):

at N < 2, H is undefined
at N = 2, H = K
at 2^(K-1) < N <= 2^K, H = 1
at N > 2^K, H = 0 (there must be duplicates)

I would have to guess that it is a rounded logarithm with a little tweaking. There is obviously a relationship between 2^1, ..., 2^(K-1) and H, but it is not exact.
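These boundary cases can be checked exhaustively for tiny K (a brute-force sketch over every N-element subset of the 2^K strings, using the single-set reading of the problem; this is only feasible for very small K):

```python
from itertools import combinations

def hamming(a, b):
    return bin(a ^ b).count("1")

def best_min_distance(k, n):
    # max over all n-element subsets of {0, ..., 2^k - 1} of the
    # minimum pairwise Hamming distance within the subset
    return max(min(hamming(a, b) for a, b in combinations(s, 2))
               for s in combinations(range(2 ** k), n))

# e.g. for K = 3: H is 3 at N = 2, and 1 for 4 < N <= 8, as claimed
for n in range(2, 9):
    print(n, best_min_distance(3, n))
```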


IP Logged 



Nathan Hellweg
Guest


Re: HARD: Hamming Distance Questions
« Reply #10 on: Jul 30th, 2002, 11:01pm »

YABB ate my last message, so I'll be brief. If you have n strings, each string k bits long, and the Hamming distance is d, then, if d is even, this inequality holds:

n · sum(i = 0 to d/2) (k choose i) <= 2^k

Moreover, if the inequality holds for some d, then there is a set of strings with the appropriate properties. The trick is finding a closed form for the sum / inverting the sum. I suspect that the sum can be expressed as an integral in a nice way, but I'm not quite there yet.


IP Logged 



oliver
Guest


Re: HARD: Hamming Distance Questions
« Reply #11 on: Jul 31st, 2002, 1:06am »

Nathan, you came to that formula by looking at "balls" around the points of the maximum set, right? By "balls" I mean B_d(x) := {y | d(x,y) <= d}. That's the way I found this formula too, but with one small correction: you have to use balls with radius (d-1)/2. If two points have Hamming distance d (d even), there's clearly a point between them which has Hamming distance d/2 to each of them, and therefore belongs to both balls. That means the balls aren't disjoint, and the inequality may not hold (these are all my problems, just speculating). Maybe you had another idea; but if that was your reasoning, maybe you know whether this inequality is sharp enough, i.e. whether we now just need to find the maximum d for which the inequality holds. I'm not so sure about that.
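With the corrected radius t = floor((d-1)/2), the ball-counting argument gives the classical sphere-packing (Hamming) upper bound, sketched here (an upper bound on n only; it is not always attainable):

```python
from math import comb

def hamming_bound(k, d):
    # disjoint balls of radius t = (d-1)//2 around each chosen string
    # must fit inside the 2^k strings, so n <= 2^k / V(k, t)
    t = (d - 1) // 2
    volume = sum(comb(k, i) for i in range(t + 1))  # V(k, t)
    return 2 ** k // volume

print(hamming_bound(8, 3))  # 28: at most 28 length-8 strings at pairwise distance >= 3
```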


IP Logged 



oliver
Guest


Re: HARD: Hamming Distance Questions
« Reply #12 on: Jul 31st, 2002, 11:03am »

FWIW, I wrote some small Python methods in order to get some numbers to work with. The problem is that I cannot prove ATM that my solutions are ideal, but for all instances I have tested (k <= 4) the solution seems right. Another problem is that I may simply have an error in my code ;>. Anyway, for all people interested in logical/mathematical thinking, Python is a nice and clear language. I do something like Eratosthenes' sieve. I'll explain it with the code. Code:

from operator import xor

def build_space(k):
    space = {}
    for i in range(0, 2 ** k):
        space[i] = 1
    return space

def build_ball(k, d):
    ball_1 = []
    for i in range(0, k):
        ball_1.append(2 ** i)
    ball = {}
    for i in ball_1:
        ball[i] = 1
    for i in range(2, d):
        d_ball = {}
        for j in ball_1:
            for m in ball.keys():
                d_ball[xor(j, m)] = 1
        for l in d_ball.keys():
            ball[l] = 1
    if 0 in ball:
        del ball[0]
    return list(ball.keys())

def hs(k, d):
    space = build_space(k)
    ball = build_ball(k, d)
    for i in space.keys():
        if space[i]:
            for j in ball:
                space[xor(i, j)] = 0
    min_set = [i for i in space.keys() if space[i]]
    return len(min_set), min_set

for k in range(2, 6):
    for d in range(2, k):
        a = hs(k, d)
        print('k: %i, d: %i, n: %i, set: %s' % ((k, d) + a))
build_space(k) constructs a dictionary (hash table) of all binary strings of length k, represented as integers. build_ball(k, d) constructs a "ball" of Hamming diameter d-1 around 0 (remember, integer representation: 0 is the all-zeros bitstring), leaving out zero itself. IOW, this ball contains all bitstrings which have at most d-1 nonzero bits. Remark: we use the fact that, for two given bitstrings x and y, the Hamming distance between them is less than d if and only if x XOR y is an element of this ball. hs(k, d): hamming sieve. I'll let you figure out that one ;P

Running the loop at the end, we get:

k: 3, d: 2, n: 4, set: [0, 3, 5, 6]
k: 4, d: 2, n: 8, set: [0, 3, 5, 6, 9, 10, 12, 15]
k: 4, d: 3, n: 2, set: [0, 7]
k: 5, d: 2, n: 16, set: [0, 3, 5, 6, 9, 10, 12, 15, 17, 18, 20, 23, 24, 27, 29, 30]
k: 5, d: 3, n: 4, set: [0, 7, 25, 30]
k: 5, d: 4, n: 2, set: [0, 15]

Here's a longer list:

k: 3, d: 2, n: 4
k: 4, d: 2, n: 8
k: 4, d: 3, n: 2
k: 5, d: 2, n: 16
k: 5, d: 3, n: 4
k: 5, d: 4, n: 2
k: 6, d: 2, n: 32
k: 6, d: 3, n: 8
k: 6, d: 4, n: 4
k: 6, d: 5, n: 2
k: 7, d: 2, n: 64
k: 7, d: 3, n: 16
k: 7, d: 4, n: 8
k: 7, d: 5, n: 2
k: 7, d: 6, n: 2
k: 8, d: 2, n: 128
k: 8, d: 3, n: 16
k: 8, d: 4, n: 16
k: 8, d: 5, n: 4
k: 8, d: 6, n: 2
k: 8, d: 7, n: 2
k: 9, d: 2, n: 256
k: 9, d: 3, n: 32
k: 9, d: 4, n: 16
k: 9, d: 5, n: 4
k: 9, d: 6, n: 4
k: 9, d: 7, n: 2
k: 9, d: 8, n: 2

I'd be glad to hear criticisms, etc.


IP Logged 



Nathan Hellweg
Guest


Re: HARD: Hamming Distance Questions
« Reply #13 on: Jul 31st, 2002, 11:09am »

on Jul 31st, 2002, 1:06am, oliver wrote: "That's the way I found this formula, too, but with one small correction. You have to use balls with radius (d-1)/2."

You're right, I've got an off-by-one error, so it's (d-1)/2 and odd d's. Really, I should be using balls with radius d-1 and figuring out how much they get shared.


IP Logged 



oliver
Guest


Re: HARD: Hamming Distance Questions
« Reply #14 on: Jul 31st, 2002, 11:09am »

Argh, just saw a possible point of confusion: two variable names in the code really do contain underscores, namely ball_1 and d_ball. Sorry.


IP Logged 



Nathan Hellweg
Guest


Re: HARD: Hamming Distance Questions
« Reply #15 on: Jul 31st, 2002, 8:11pm »

I haven't gone through and done the algebra, but it looks like there is an approximation involving the inverse error function (yuck) that can be derived by using Stirling's approximation for the binomial terms and then integrating instead of taking the sum.


IP Logged 



william wu
wu::riddles Administrator
Gender:
Posts: 1291


Re: HARD: Hamming Distance Questions
« Reply #16 on: Aug 3rd, 2002, 11:54pm »

OK, there's some heavy analysis going on, so it's going to take me a while to process all this information skeptically. I'll get back to you guys later, perhaps after my stupid GRE. Others are welcome to help me comb through the logic, though. I appreciate all the effort that's going into this. In the meantime, I'll be memorizing obfuscated words.


IP Logged 
[ wu ] : http://wuriddles.com / http://forums.wuriddles.com



Nathan Hellweg
Guest

I just had an epiphany. How does this sound: you've got to maintain a Hamming distance of d. Let's make the (likely) assumption that I can create a situation where the distance is always a multiple of d. I need to come up with a good explanation for this assumption, but it seems reasonable to me right now. Then if you have k digits, you can pack in 2^(floor(k/d)) numbers, so n = 2^(floor(k/d)). Now, if you're given n, then you can pack with the distance floor(k/log2(n)) under the same circumstances. So, if that assumption holds, then d = floor(k/log2(n)). Clearly this is optimal when k/log2(n) is an integer... Now I need to prove the assumption :/ Of course it's probably strong enough for you CS types.


IP Logged 



Nathan Hellweg
Guest


Re: HARD: Hamming Distance Questions
« Reply #18 on: Aug 5th, 2002, 11:26am »

Please Ignore Previous Post %)


IP Logged 



oliver
Newbie
Posts: 25


Re: HARD: Hamming Distance Questions
« Reply #19 on: Aug 5th, 2002, 11:59am »

Nathan, I see you already retracted that, but maybe you're on to something. First, the set of bitstrings of length k is a group, with group operation XOR. And it seems like the minimal sets I provided are always subgroups of this group, which means they are also groups under XOR in their own right. That is consistent with your claim (hope) that the distance is always a multiple of d. If the above could be proved, we would at least know that n divides 2^k. As a consequence, we would know that n for an optimal set is of the form 2^i, for an integer i.
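The subgroup observation is easy to test on the sieve outputs (a sketch; a nonempty set of bitstrings-as-integers is a subgroup of the XOR group exactly when it is closed under XOR, since 0 = a XOR a then comes for free):

```python
def is_xor_subgroup(s):
    # closure under XOR is the whole test: a ^ a = 0 and a ^ 0 = a follow
    return all(a ^ b in s for a in s for b in s)

print(is_xor_subgroup({0, 3, 5, 6}))    # True: the k=3, d=2 set
print(is_xor_subgroup({0, 7, 25, 30}))  # True: the k=5, d=3 set
```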


IP Logged 



Nathan Hellweg
Guest


Re: HARD: Hamming Distance Questions
« Reply #20 on: Aug 6th, 2002, 11:11pm »

Here's the result of the foray into estimation. We had:

n · \sum_{i=0}^{(d-1)/2} (k choose i) <= 2^k

Now, there is an estimation that goes like this:

n · \integral_{-\infty}^{(d-k/2)/\sqrt{k/8}} (2\pi)^{-.5} · e^{-x^2/2} dx <= 2^k

By substitution we get \integral (2\pi)^{-.5} · e^{-x^2/2} dx = \pi^{.5} · erf(x · 2^{-.5}), and since erf(-\infty) = -1 we get

n · \pi^{.5} · erf(((d-k/2)/\sqrt{k/8}) · 2^{-.5}) <= 2^k

Some algebra gives

ierf(2^k · \pi^{.5} / n) >= ((d-k/2)/\sqrt{k/8}) · 2^{-.5}

More algebra gives

ierf(2^k · \pi^{.5} / n) · 2^{.5} · \sqrt{k/8} + k/2 >= d

where ierf is the inverse error function and erf is the error function. Which is probably a good estimate for large k, d, and n.

« Last Edit: Aug 7th, 2002, 3:27pm by william wu » 
IP Logged 



NoJoy
Newbie
Posts: 3


Re: HARD: Hamming Distance Questions
« Reply #21 on: Aug 15th, 2002, 12:44pm »

I'm going to step through my [incomplete] analysis, even though I'm pretty sure I'm just restating what others have said before. Let's define a closed ball of radius d around a bitstring s of length k: B(k,s,d) = {t | H(s,t) <= d}. How many bitstrings are in such a ball? By symmetry, |B(k,s,d)| = |B(k,0,d)|, where 0 is the string of k zeroes. But a bitstring is at Hamming distance i from 0 iff it has i one-bits, so the number of bitstrings at Hamming distance i from 0 is the number of ways of choosing i one-bits from k bits. So |B(k,0,d)| = sum(0 <= i <= d : k choose i). Let's denote this by f(k,d). [As an aside, the terms of this sum are the previously mentioned elements of the kth row of the Pascal (Tartaglia) triangle, and if I understand the approximation thread, there is a smooth approximation to this up-and-down staircase which can be integrated to produce the ith partial sum, right?] So suppose we have a set of n bitstrings of length k with minimum pairwise Hamming distance d. For odd values of d, we can form a ball of radius (d-1)/2 around each element of the set which is disjoint from every other such ball, so n · f(k, (d-1)/2) <= 2^k, and f(k, (d-1)/2) <= (2^k)/n. For even values of d, things are not so clear, because the balls overlap. But I'd like to think that if we account for the overlap, we're left with finding the maximum d satisfying some inequality. [As an aside, if I follow the approximation thread, this corresponds to using an inverse function, right?] So I'm left with three unanswered questions: 1. Is it actually always possible to pack the balls this tightly, i.e., is the inequality tight enough? 2. What is required to account for the overlap in the case of even values of d? 3. Would an expression like d = max{...} count as a "closed form solution"?


IP Logged 



oliver
Newbie
Posts: 25


Re: HARD: Hamming Distance Questions
« Reply #22 on: Aug 15th, 2002, 1:40pm »

Quote: "3. Would an expression like d = max{...} count as a "closed form solution"?"

NoJoy, I fear that this can never be a closed-form solution, because if you think about it a little, with such an expression it's quite easy to just write down exactly the definition of d. And BTW, you should really read the other posts in this thread, IMO, because they answer some of your questions, and you'll see that the code I posted uses exactly your B(...) ideas.


IP Logged 



James Fingas
Guest


Re: HARD: Hamming Distance Questions
« Reply #23 on: Aug 19th, 2002, 2:04pm »

Uh-oh! I have run into a little snag I thought I'd let you know about. The angle I've been taking on this problem is similar to Oliver's angle (but without the code): for a given length k and a distance D, find the maximum possible number of messages n. I have been generating these messages by hand (it's pretty hard to do). At first, I found that n was always a power of 2, but I can disprove this by counterexample. Take k=8, D=3. First of all, using the "ball" argument, there can be a maximum of 28 messages:

total messages for k=8: 2^8 = 256
each message has 8 neighbours (messages only one bit away), plus the message itself
256/9 = 28.4

Since none of these neighbours can be shared, it must hold true that we have at most 28 messages. However, examine the following 20 messages, and please verify that they are all at least 3 bits different:

0000 0000
1110 0000
0011 1000
0000 1110
1000 0011
1001 0100
0010 0101
0100 1001
0101 0010
1010 1010
0101 0101
1101 1000
0110 1100
0011 0110
0001 1011
1000 1101
1100 0110
0110 0011
1011 0001
1111 1111

Therefore, there are between 20 and 28 messages that you can fit into 8 bits with a minimum Hamming distance of 3. Since there are no powers of 2 between 20 and 28, obviously n isn't always a power of 2! So Oliver, I guess that you have to revise your list to read:

k: 8, d: 2, n: 128
k: 8, d: 3, n: 20(+?)
etc.

This has given me a good reason to look over all my calculations, and now I'm not sure in what direction to proceed. All I know for sure is that:

1) for k < D, you can't send any messages
2) for k >= D, you can send at least two messages
3) for 2k >= 3D, you can send at least four messages
4) something similar can probably be worked out for 8 messages, but it gets complicated
5) for D = 2, you can send 2^(k-1) messages
6) for D = 1, you can send 2^k messages (I hope this doesn't surprise anyone ;)
7) if you can send n messages with k=k1, D=D1, then you can send at least n messages with k=2·k1, D=2·D1. Sometimes, you can send more
8) if you can send n messages with k=k1, D=D1, then you can send at most n messages with k=k1+1, D=D1+1. (I'm pretty sure of this)
9) if you can send n messages with k=k1, D=D1, then you can send at most 2n messages with k=k1+1, D=D1. (I'm pretty sure of this one too)

Anyways, I'm sorry to throw a monkey wrench into the whole works, but I've found that we can't restrict n to be powers of 2.

In the communications world, I don't think this problem was ever solved. It turns out that thinking in terms of a Hamming distance usually doesn't suit the problem. The idea behind Hamming distance was to build redundancy into the coding of signals, so that if you got a bit wrong, you would know you got it wrong, and possibly be able to correct it. However, since most signals aren't sent out in binary, but rather in m-ary signalling, the Hamming distance between signals won't match up with the actual probabilities of bits having errors. So they moved on to trellis-coded modulation, then on to turbo codes and such, without (AFAIK) solving the Hamming distance problem.

From a communications perspective, you can encode signals by starting with m message bits, then tacking on (k-m) parity bits. By choosing the parity bits correctly, you can make the Hamming distance between all possible messages quite large (bounded by k-m+1). In practice, choosing those parity bits was half luck, half magic, and all you could do was catalogue what parity bits worked best. When trellis-coded modulation came around, the communications field did an end-run around the problem, and started to use probability (a la maximum a posteriori probability, or MAP receivers) to calculate which message was most likely transmitted.

Good luck!
James Fingas
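James asks readers to verify the list by hand; a brute-force check does it in a few lines (a sketch, with the 20 messages transcribed as 8-bit integers; every one of the 190 pairs should differ in at least 3 bits):

```python
from itertools import combinations

# James's 20 candidate messages for k=8, D=3, as 8-bit integers
words = [0b00000000, 0b11100000, 0b00111000, 0b00001110,
         0b10000011, 0b10010100, 0b00100101, 0b01001001,
         0b01010010, 0b10101010, 0b01010101, 0b11011000,
         0b01101100, 0b00110110, 0b00011011, 0b10001101,
         0b11000110, 0b01100011, 0b10110001, 0b11111111]

# minimum pairwise Hamming distance over all 190 unordered pairs
min_d = min(bin(a ^ b).count("1") for a, b in combinations(words, 2))
print(min_d)  # 3
```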


IP Logged 



oliver
Newbie
Posts: 25


Re: HARD: Hamming Distance Questions
« Reply #24 on: Aug 19th, 2002, 11:38pm »

Nice work, James! Your example is indeed correct (I tested it with a small program). That's pretty bad: it certainly kills the thesis that the minimal set is always a subgroup (or can trivially be mapped 1:1 to one). Nasty. Wow, I wouldn't have thought that the algorithm I implemented was simply wrong. But there's no need to apologize; what you did was certainly very helpful.


IP Logged 



