Can you prove 1 = 2?



I don't think you guys get what I am saying.

When you have an algebraic equation, it is supposed to hold for all numbers; that is what makes it a valid equation. But if it fails for even one instance (the one described), then it is shown not to hold in general.

An example would be this equation:

x = (a - b)/(a - b)

where in ALMOST all cases x = 1 and the equation is correct.

BUT

if a = b, the equation breaks down, as x cannot be determined.

Yet as an algebraic expression it is perfectly correct. How? Does this mean that for an algebraic formula to be correct you need to substitute in real numbers to prove it? If so, doesn't this show a flaw in algebra?
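The failure case described here can be seen directly by evaluating the expression. This is a minimal Python sketch; the function name `x_of` is made up for illustration:

```python
def x_of(a, b):
    """Evaluate x = (a - b) / (a - b)."""
    return (a - b) / (a - b)

print(x_of(5.0, 3.0))   # 1.0: fine whenever a != b

try:
    x_of(4.0, 4.0)      # a == b gives 0/0, which is undefined
except ZeroDivisionError:
    print("0/0 is undefined, so x cannot be assigned a value")
```

The expression isn't "wrong" at a = b; it simply has no value there, which is exactly the error the interpreter reports.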
 
Actually, the correct algebraic expression would state a =/= b as a restriction (a limit on the domain). Most of the time this isn't stated because the a = b case is trivial.

But still, the equation you gave works perfectly even if a = b: it's telling you the answer; it's just that the answer isn't defined. How can you expect it to give you a definite one, and how is that a flaw of algebra?

Sketch the plot of the equation and you'll see what you're asking for.
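A quick numeric check (a Python sketch, with b fixed arbitrarily at 2.0) shows what that plot looks like: the value is exactly 1 for every a != b, with a single hole at a = b:

```python
b = 2.0
for eps in (1.0, 0.1, 1e-6, 1e-12):
    a = b + eps                     # approach a = b from above
    print(eps, (a - b) / (a - b))   # exactly 1.0 for every a != b
```

So the graph is the constant line x = 1 with one point missing, which is why the a = b case tells you nothing new.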

(BTW, If you haven't started using limits in mathematics, I'm also guessing you probably haven't had much education in the subject. It's standard above A-level (UK), High School/Freshman College (US) level.)



Correction: the a = b case isn't, strictly speaking, a trivial solution, just not a useful one.
 
A-Level Maths, and maths at Imperial College London. Sorry.

As the limit is not defined there, the equation is incorrect.
 

WTF that's random, I graduated IC in Physics.

But to the point, what exactly is incorrect about the equation? It's telling you precisely what you've asked of it.

Edit: OK, reread your post above and I see what you're getting at. But it's hardly a refutation of algebra, just an acknowledgement that not all numbers that exist are real.
 
No, it's a flaw in algebra. The equation works perfectly in algebra; it's when you put numbers in that it doesn't work, i.e. algebra is flawed, so it doesn't work.

It seems to me that you're repeating what your teacher told you and passing it off as your own opinion.

And it's still stupid, GTFO.
 
Here, let me try. I don't think it will make much sense, but there's no harm in giving it a try.

A simple formula:

Let a = 1 and b = 2.

Now,

0 = 0
a x 0 = b x 0

Cancelling the 0, as it is present on both sides:

Thus, a = b, or 1 = 2 ...

Have I done it correctly? :hitit_sml:
 

0 = 0
(a x b) x 0 = (a x b) x 0
 

I give up. You're absolutely right. Algebra is fatally flawed because of what you posted above. In fact, I think I will stop using algebra because of this.


"Cancelling 0" is the same as dividing by zero. You can't do that.
 

If algebra doesn't work how the hell did we figure out how to get to the moon?
 
By using calculus? :)

As a matter of fact, most of the confusion in this thread seems to be the result of people not understanding that algebra is just a small subset of math with limited application.
 
WTF that's random, I graduated IC in Physics.

How weird. I moved from Maths to Physics :)
 
As a matter of fact, most of the confusion in this thread seems to be the result of people not understanding that algebra is just a small subset of math with limited application.

I'd say algebra has ubiquitous application.
 
As a matter of fact, most of the confusion in this thread seems to be the result of people not understanding that algebra is just a small subset of math with limited application.

You realize calculus relies on the existence of algebra, correct?