wu :: forums (http://www.ocf.berkeley.edu/~wwu/cgi-bin/yabb/YaBB.cgi)
riddles >> general problem-solving / chatting / whatever >> Computers that Lie
(Message started by: amichail on Jul 6th, 2005, 4:03am)

Title: Computers that Lie
Post by amichail on Jul 6th, 2005, 4:03am
What do you think of this?

http://clevercs.org/?module=articles&func=display&ptid=1&aid=476

Title: Re: Computers that Lie
Post by towr on Jul 6th, 2005, 8:10am
Computers lie each time they accuse me of a user error  ;D

Eventually, when computers get smarter, they will have to lie to be able to interact comfortably with people. Or perhaps not so much lie as not tell the whole truth and steer away from the subject.

It's hard to estimate how little of what computers tell us is exactly true now anyway. Anything from a knowledge base is only as good as the person who put the knowledge in there.

Title: Re: Computers that Lie
Post by Grimbal on Jul 6th, 2005, 3:33pm
When a computer tells me 2/3 is 0.6667, it is lying; it's just too lazy to give me the real, complete figure.

Or when it tells me that -2^2 = 4...
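Both are easy to reproduce. A minimal Python sketch (the Excel remark in the last comment is about spreadsheet precedence, not Python):

print(f"{2/3:.4f}")  # 0.6667 -- the display rounds, and the stored double is itself only an approximation of 2/3
print(2/3)           # 0.6666666666666666 -- binary floating point cannot represent 2/3 exactly
print(-2**2)         # -4: Python's ** binds tighter than unary minus, so this is -(2**2)
print((-2)**2)       # 4: what a spreadsheet like Excel returns for =-2^2, because its unary minus binds tighter than ^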

Title: Re: Computers that Lie
Post by Icarus on Jul 7th, 2005, 5:51pm
Question: Is it lying when you tell someone something that is, in a sense, true, but at a deeper level is false? We do this all the time when teaching.

One example: when I was first taught to subtract, I was told that the first number had to be greater than or equal to the second, because there was no such thing as 2 - 3. Later I was told: yes, there is! But you can't take its square root; that doesn't exist. Later still, I was told: yes, you can!

A less well-known example: when I first learned physics, we spoke often about events at different locations occurring at the same time. Much physical calculation made use of this concept.
But when I started studying relativity, I discovered that this concept is false. Events that appear simultaneous to one observer will not be so for other observers. I quickly learned that there is no "universal time" to which all events can be referred.

Then I reveled in this knowledge, chuckling in condescension at the uneducated who believe in simultaneity! Finally, as I came to understand the ins and outs of relativity, I started to look at standard cosmological models. What did I find? THEY ALL HAVE A UNIVERSAL TIMELINE!!! And the realization came to me: while matter travels in numerous directions, there is an average position/motion for the whole of the matter in the universe. This average nicely defines a "superqualified" observer, and a universal timeline to which all others can be compared.

C'est la vie!

If it is okay for us to do this directly, I don't see why it is any worse to do it through a computer.

And make no mistake: at this point in time, if a computer "lies", it isn't the computer but the programmer or designer who is lying. After all, if we find a lie in a book, do we accuse the book itself of lying, or the author?

Title: Re: Computers that Lie
Post by Grimbal on Jul 8th, 2005, 2:05am
Anyway, I see two examples:

An obvious one is when "you know which" PC operating system hides system files to prevent users from messing with them.

Another one was in an article about user interface design. It said people feel more confident in an answer that comes after a small delay than in one that comes immediately. So the author proposed adding small artificial delays to increase trust in the system.

Well, that was before the internet.
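A minimal sketch of that trick, assuming nothing from the article beyond the idea itself (the function name and the half-second floor are my own):

import time

def answer_with_delay(compute, min_seconds=0.5):
    # Return compute()'s result, but never sooner than min_seconds,
    # so that instant answers don't look careless.
    start = time.monotonic()
    result = compute()
    elapsed = time.monotonic() - start
    if elapsed < min_seconds:
        time.sleep(min_seconds - elapsed)
    return result

print(answer_with_delay(lambda: 6 * 7))  # prints 42 after roughly half a second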

Title: Re: Computers that Lie
Post by amichail on Jul 8th, 2005, 2:39am
Would it make sense to give a theorem a simpler proof that is wrong, but reminiscent of a much more complicated proof that is correct?

And if so, how would you build a computer program to simplify formally specified proofs to make them easier to understand -- even if they would not be quite correct?

Title: Re: Computers that Lie
Post by towr on Jul 8th, 2005, 3:12am

on 07/08/05 at 02:39:50, amichail wrote:
Would it make sense to give a theorem a simpler proof that is wrong, but reminiscent of a much more complicated proof that is correct?

Depends on who the proof is meant for. You can't submit a simplified wrong proof to a CS/math journal and expect them to accept it.
But it may be acceptable as a guide to explain why, say, a program you wrote does what you intend it to.


Quote:
And if so, how would you build a computer program to simplify formally specified proofs to make them easier to understand -- even if they would not be quite correct?

The standard technique for providing correct but simplified proofs is to break the proof up into lemmas and theorems, rather than giving a single proof based solely on axioms.
If you can prove every lemma and theorem from other lemmas, theorems, and axioms, then the whole must be true, provided the axioms hold.
This also makes subsequent proofs faster, since the prover can just reuse old lemmas it has proven before, without going through the same work again and again.
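A toy sketch of that lemma reuse in Python (the representation and names are illustrative, not taken from any real prover):

# proven maps each established statement to the facts its proof cited.
proven = {}

def prove(statement, cites):
    # A "proof" here just checks that every citation is an axiom or a
    # previously proven statement, then caches the result for reuse.
    missing = [c for c in cites if c not in proven and not c.startswith("axiom:")]
    if missing:
        raise ValueError("unproven dependencies: %s" % missing)
    proven[statement] = cites
    return statement

prove("lemma 1", ["axiom:A1", "axiom:A2"])
prove("lemma 2", ["axiom:A1"])
prove("theorem", ["lemma 1", "lemma 2"])  # cites the cached lemmas instead of re-deriving them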

(I'm actually working on a theorem prover. If anyone has any tips or advice, they'd be welcome ;D)


