Topic: Computers that Lie (Read 2607 times)

towr
wu::riddles Moderator Uberpuzzler
Some people are average, some are just mean.
Posts: 13730


Re: Computers that Lie
« Reply #1 on: Jul 6th, 2005, 8:10am »

Computers lie each time they accuse me of a user error. Eventually, when computers get smarter, they will have to lie to be able to interact comfortably with people. Or perhaps not so much lie as not tell the whole truth, and steer away from the subject. It's hard to estimate how little of what computers tell us is exactly true now anyway; anything from a knowledge base is only as good as the person who put the knowledge in there.


Wikipedia, Google, Mathworld, Integer sequence DB



Grimbal
wu::riddles Moderator Uberpuzzler
Posts: 7519


Re: Computers that Lie
« Reply #2 on: Jul 6th, 2005, 3:33pm »

When a computer tells me 2/3 is 0.6667, it is lying, or at least too lazy to give me the real, complete figure. Or when it tells me that 2^2 = 4...
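To be fair to the machine, that 0.6667 is a display choice rather than the stored value. A quick check in Python (my choice here, purely for illustration) shows where the rounding happens:

Code:

from fractions import Fraction

# 2/3 has no finite binary or decimal expansion, so something has to give.
x = 2 / 3
print(f"{x:.4f}")      # 0.6667 -- rounded for display
print(repr(x))         # 0.6666666666666666 -- the stored double, still inexact
print(Fraction(2, 3))  # 2/3 -- exact only if you keep it symbolic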





Icarus
wu::riddles Moderator Uberpuzzler
Boldly going where even angels fear to tread.
Posts: 4863


Re: Computers that Lie
« Reply #3 on: Jul 7th, 2005, 5:51pm »

Question: Is it lying when you tell someone something that is, in a sense, true, but at a deeper level is false? We do this all the time when teaching.

One example: when I was first taught to subtract, I was told that the first number had to be greater than or equal to the second, because there was no such thing as 2 - 3. Later I was told: yes, there is! But you can't take its square root; that doesn't exist. Later still, I am told: yes, you can!

A less well-known example: when I first learned physics, we spoke often about events at different locations occurring at the same time, and much physical calculation made use of this concept. But when I started studying relativity, I discovered that this concept is false. Events that appear simultaneous to one observer will not be so for other observers. I quickly learned that there is no "universal time" to which all events can be applied. Then I reveled in this knowledge, chuckling in condescension at the uneducated who believe in simultaneity! Finally, as I came to understand the ins and outs of relativity, I started to look at standard cosmological models. What did I find? THEY ALL HAVE A UNIVERSAL TIMELINE!!! And the realization came to me: while matter travels in numerous directions, there is an average position/motion to the whole of the matter in the universe. That average nicely defines a "superqualified" observer, and a universal timeline to which all others can be compared. C'est la vie!

If it is okay for us to do this directly, I don't see why it is any worse to do it through a computer. And make no mistake: at this point in time, if a "computer lies", it isn't the computer but the programmer or designer who is lying. After all, if we find a lie in a book, do we accuse the book itself of lying, or the author?


"Pi goes on and on and on ... And e is just as cursed. I wonder: Which is larger When their digits are reversed? "  Anonymous



Grimbal
wu::riddles Moderator Uberpuzzler
Posts: 7519


Re: Computers that Lie
« Reply #4 on: Jul 8th, 2005, 2:05am »

Anyway, I see two examples. An obvious one is when "you know which" PC operating system hides system files to prevent users from messing with them. The other was in an article about user-interface design: it said people feel more confident in an answer that comes after a small delay than in one that comes immediately, so the author proposed adding small artificial delays to increase trust in the system. Well, that was before the internet.
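A minimal sketch of that delay trick, again in Python; the function name and the 0.8-second threshold are my own inventions, since the article gave no specifics:

Code:

import time

MIN_RESPONSE_SECONDS = 0.8  # hypothetical threshold; the article named no number

def answer_with_artificial_delay(compute_answer):
    """Return compute_answer(), but never faster than MIN_RESPONSE_SECONDS.

    The article's claim: an answer that arrives after a short pause
    reads as more considered than an instantaneous one.
    """
    start = time.monotonic()
    result = compute_answer()
    elapsed = time.monotonic() - start
    if elapsed < MIN_RESPONSE_SECONDS:
        time.sleep(MIN_RESPONSE_SECONDS - elapsed)
    return result

# A fast lookup now appears to "think" for a moment.
print(answer_with_artificial_delay(lambda: 2 + 2))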





amichail
Senior Riddler
Posts: 450


Re: Computers that Lie
« Reply #5 on: Jul 8th, 2005, 2:39am »

Would it make sense to give a simpler proof of a theorem, a proof that is wrong but reminiscent of a much more complicated proof that is correct? And if so, how would you build a computer program to simplify formally specified proofs to make them easier to understand, even if they would not be quite correct?

« Last Edit: Jul 8th, 2005, 2:41am by amichail »
DropZap - a new kind of block-elimination game



towr
wu::riddles Moderator Uberpuzzler
Some people are average, some are just mean.
Posts: 13730


Re: Computers that Lie
« Reply #6 on: Jul 8th, 2005, 3:12am »

on Jul 8th, 2005, 2:39am, amichail wrote:
Quote: Would it make sense to give a simpler proof of a theorem, a proof that is wrong but reminiscent of a much more complicated proof that is correct?

Depends on who the proof is meant for. You can't submit a simplified wrong proof to a CS/math journal and expect them to accept it. But it may be acceptable as a guide to explain why, say, a program you wrote does what you intend it to.

Quote: And if so, how would you build a computer program to simplify formally specified proofs to make them easier to understand, even if they would not be quite correct?

The standard technique for providing correct but simplified proofs is to break the proof up into lemmas and theorems, rather than giving one proof based solely on the axioms. If you can prove every lemma and theorem from other lemmas, theorems, and axioms, then the whole must be true, provided the axioms are consistent. This also makes subsequent proofs faster, since the prover can simply reuse lemmas it has proven before, without going through the same steps again and again. (I'm actually working on a theorem prover. If anyone has any tips or advice, it'd be welcome.)
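To make the lemma idea concrete, here is a minimal sketch in Lean 4 (my choice of prover, not necessarily towr's; it assumes the built-in omega tactic for linear arithmetic). The final theorem reuses two lemmas instead of re-deriving everything from scratch:

Code:

-- Two small lemmas, each proved once.
theorem add_self_eq_two_mul (n : Nat) : n + n = 2 * n := by
  omega

theorem two_mul_even (n : Nat) : (2 * n) % 2 = 0 := by
  omega

-- The main result leans on the lemmas rather than starting over from the axioms.
theorem add_self_even (n : Nat) : (n + n) % 2 = 0 := by
  rw [add_self_eq_two_mul]
  exact two_mul_even n

A prover that caches the first two lemmas can discharge later goals mentioning them without re-running their proofs, which is exactly the reuse described above.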

« Last Edit: Jul 8th, 2005, 3:14am by towr »
Wikipedia, Google, Mathworld, Integer sequence DB



