He's just mad because he got held back in school for failure to understand basic concepts of arithmetic.
What the fuck did you do here? Those things aren't equal at all.
I don't think you guys get what I am saying.
When you have an algebraic equation, it is supposed to hold for all numbers; that is what makes it a valid equation. But if it fails for even one instance (the one described), then it is proven not to work in general.
An example would be this equation:
x = (a - b) / (a - b)
where in ALMOST all cases x = 1 and the equation is correct,
BUT
if a = b then the equation breaks down, as x cannot be defined.
But as an algebraic expression it is perfectly correct. How? Does this mean that for an algebraic formula to be correct you need to substitute in real numbers to prove it? If so, doesn't this show a flaw in algebra?
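To see concretely where it breaks, here is a minimal Python sketch (the names a and b are the ones from the post; the function name x is just mine):

    # x = (a - b) / (a - b): fine whenever a != b, undefined when a == b
    def x(a, b):
        return (a - b) / (a - b)

    print(x(5, 3))  # 1.0, since (5 - 3) / (5 - 3) = 2 / 2 = 1
    print(x(7, 7))  # raises ZeroDivisionError, since 7 - 7 = 0

The expression isn't wrong so much as it has a restricted domain: it is simply undefined at a = b, just as 1/x is undefined at x = 0.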
Actually, the correct algebraic expression would state a ≠ b as a restriction. Most of the time this isn't stated because the a = b case is trivial.
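For what it's worth, computer algebra systems make the same silent assumption. A minimal SymPy sketch (my example, not from the thread):

    import sympy

    a, b = sympy.symbols('a b')
    expr = (a - b) / (a - b)
    print(expr)  # prints 1 -- the cancellation silently assumes a != b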
It's a flaw that you can't divide by zero?
Being able to divide by zero would wreck, uh, pretty much everything.
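To spell that out (a sketch assuming division by zero obeyed the usual rule that (a/b) x b = a): suppose 1/0 = k for some number k. Then

1 = (1/0) x 0 = k x 0 = 0

and multiplying 1 = 0 by any number n gives n = 0, so every number would equal 0, and hence every other number.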
A-Level Maths and maths at Imperial College London. Sorry
If the restriction is not stated, that makes the equation incorrect.
No, it's a flaw in algebra. The equation works perfectly in algebra; it's only when you put numbers in that it doesn't work, i.e. algebra is flawed, so it doesn't work.
Here, let me try. I don't think it will make much sense, but there's no harm in giving it a try.
A simple formula:
Let a = 1 and b = 2.
Now,
0 = 0
a x 0 = b x 0
Cancelling the 0, as it is present on both sides:
Thus, a = b, or 1 = 2 ...
Have I done it correctly? :hitit_sml:
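For what it's worth, the slip is the cancelling step: cancelling a common factor means dividing both sides by it, and here that factor is 0. A quick Python check (a and b as in the post):

    a, b = 1, 2

    print(a * 0 == b * 0)  # True: both sides equal 0, so this line is fine
    print(a == b)          # False: the 'cancel the 0' step divided by zero

Multiplying both sides of an equation by 0 is allowed; dividing both sides by 0 is not, which is exactly why 1 = 2 falls out.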
By using calculus? If algebra doesn't work, how the hell did we figure out how to get to the moon?
WTF that's random, I graduated IC in Physics.
But to the point, what exactly is incorrect about the equation? It's telling you precisely what you've asked of it.
Edit: OK, I reread your post above and I see what you're getting at. But it's hardly a refutation of algebra, just an acknowledgment that not all numbers that exist are real.
By using calculus?
As a matter of fact, most of the confusion in this thread seems to be the result of people not understanding that algebra is just a small subset of math with limited application.