
This is the static archive of the Massassi Forums. The forums are closed indefinitely. Thanks for all the memories!

You can also download Super Old Archived Message Boards from when Massassi first started.

"View" counts are as of the day the forums were archived, and will no longer increase.

Computer Science and Math and Stuff
2018-01-21, 12:02 AM #121
Originally posted by Reverend Jones:
but which still require testing to verify


Unless (emphasis added) your name is Albert Einstein! :P

Quote:
Today, we do experiments in our heads. The University of Houston's College of Engineering presents this series about the machines that make our civilization run, and the people whose ingenuity created them.

The late 19th-century German philosopher Ernst Mach added a phrase to our language. It was "thought experiment." It's a wonderfully contradictory idea.

One scientist observes and measures nature. Another writes mathematical theories. Experiment and thought seem to be separate roads to learning. Mach walked the experimenter's road. But he said we can also experiment in our heads, not just in nature.

He said, look what Galileo did. Aristotle thought heavy things should fall faster than light ones. He thought a stone should fall faster than a feather. So Galileo said, "Let's glue a feather to a stone and drop it. Together, they weigh more than either the feather or the stone alone. If Aristotle's right, they'll fall faster together than either would fall alone. But who could believe the feather wouldn't slow the stone? Aristotle has to be wrong."

That's what we call a thought experiment. Galileo did the experiment in his head. Mach made a funny argument to justify doing that. He said our intuition is accurate because evolution has shaped it. We shrink from the absurd. That's what Galileo was doing when he said a feather can't speed the fall of a stone.

Enter now the grand master of thought experiments, Albert Einstein. Einstein was Mach's disciple. He created a great theater of thought experiments to explain relativity. But relativity flies in the teeth of our intuition. He talked about throwing balls and reading clocks on railroad trains and elevators. His conclusions had nothing to do with intuition.

Mach finally cracked under the weight of Einstein's admiration. Relativity theory was, he wrote, "growing more and more dogmatic." For Mach, a theory was only a framework for our measurements. Einstein figured that you first wrote correct theories. Then the data would simply fall into place.

In 1919 Einstein shrugged off an important telegram. Someone wired that they'd seen rays of light bend under the influence of gravity. That's what relativity predicts. Einstein was unimpressed. If light hadn't bent, he said, then "I'd have been sorry for the dear Lord. The theory is correct!"

So Einstein took Mach's thought experiments and used them in ways Mach never intended. He used them to show us a world that's far more subtle and surprising than anything our intuition ever prepared us for. Mach's ideas about thought experiments finally faded because the world is much stranger than our capacity to imagine it.


https://www.uh.edu/engines/epi576.htm
2018-01-23, 2:36 PM #122
TAing calculus the second time around feels so much smoother.
2018-01-23, 2:37 PM #123
TAing circuits the fourth time around feels like ****
I had a blog. It sucked.
2018-01-23, 2:59 PM #124
Won't be me! Next year I start instructing!

Woo!
2018-01-24, 11:54 AM #125
so it turns out, there's actually a pretty easy proof that if f=f' and f is analytic, then f=ce^x for some constant c

i always suspected that f=f' meant either f=0 or f=ce^x, but i never saw such an elegant proof of uniqueness

also makes sense why it's basically the way to solve differential equations nontrivially, as having an eigenfunction seems pretty durn important

[https://i.imgur.com/AcKAVK1.png]

probably isn't special to people who are smart
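
(Rough text version of the argument, in case the image ever dies:)

Write $f(x) = \sum_{n \ge 0} a_n x^n$. Then $f'(x) = \sum_{n \ge 0} (n+1)\, a_{n+1} x^n$, so $f = f'$ forces $a_{n+1} = a_n/(n+1)$ for every $n$, i.e. $a_n = a_0/n!$. Hence

$$f(x) = a_0 \sum_{n \ge 0} \frac{x^n}{n!} = a_0 e^x,$$

with $c = a_0 = f(0)$ (and $c = 0$ giving the zero function). Strictly this is the expansion at 0; the same recursion at any other point, or uniqueness of analytic continuation, extends it to the whole line.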
2018-01-24, 11:56 AM #126
i'm not certain though about the truth of this statement outside of analytic functions, nor about the analyticity of solutions to differential equations
2018-01-24, 1:49 PM #127
On second thought, it's not particularly surprising after realizing how Taylor expansion works. Don't care, insights are insights.
2018-01-24, 2:42 PM #128
Originally posted by Reid:
so it turns out, there's actually a pretty easy proof that if f=f' and f is analytic, then f=ce^x for some constant c

i always suspected that f=f' meant either f=0 or f=ce^x, but i never saw such an elegant proof of uniqueness

also makes sense why it's basically the way to solve differential equations nontrivially, as having an eigenfunction seems pretty durn important

[https://i.imgur.com/AcKAVK1.png]

probably isn't special to people who are smart


https://www.reddit.com/r/math/comments/r3w8r/any_other_functions_than_ex_and_0_that_are/
2018-01-24, 2:47 PM #129
A First Course in Calculus, Serge Lang



https://books.google.com/books?id=vyDUBwAAQBAJ&lpg=PP1&dq=lang%20calculus&pg=PA242#v=onepage&q&f=false
2018-01-24, 2:49 PM #130
Originally posted by Reid:
i'm not certain though about the truth of this statement outside of analytic functions, nor about the analyticity of solutions to differential equations


Originally posted by Reid:
On second thought, it's not particularly surprising after realizing how Taylor expansion works. Don't care, insights are insights.


Quotient rule.
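
(Spelled out, the quotient-rule argument goes roughly like this:)

Suppose $f' = f$ and set $g(x) = f(x)/e^x$. By the quotient rule,

$$g'(x) = \frac{f'(x)\,e^x - f(x)\,e^x}{e^{2x}} = \frac{f'(x) - f(x)}{e^x} = 0,$$

so $g$ has zero derivative everywhere, hence $g \equiv c$ for some constant, and $f(x) = c\,e^x$. No analyticity assumption needed.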
2018-01-24, 4:41 PM #131
Originally posted by Reverend Jones:
Quotient rule.


This is the exact sort of uninsightful Bourbakian proof you were deriding a few days ago, though. Neat, but uninsightful.
2018-01-24, 5:05 PM #132
It's called 'calculus' for a reason.

Newton used it in secret but recast the Principia in the language of geometry in order to avoid criticism.
2018-01-24, 5:06 PM #133
Also if you read that Reddit page, apparently the use of the quotient rule in this way depends on a much more general and snobby theorem.
2018-01-24, 5:17 PM #134
https://www.reddit.com/r/math/comments/r3w8r/any_other_functions_than_ex_and_0_that_are/c42w3ui/

http://en.wikipedia.org/wiki/Picard%E2%80%93Lindel%C3%B6f_theorem

...but as dm287 wrote, you can just use the mean value theorem. Which iirc is also the key theorem justifying Taylor's Theorem, which seems closer to your approach. I do admit that using Taylor's Theorem feels quite majestic (physicists love it).
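
(For reference, the statements I mean: if $f$ is continuous on $[a,b]$ and differentiable on $(a,b)$, the mean value theorem gives a point $c \in (a,b)$ with

$$f'(c) = \frac{f(b) - f(a)}{b - a},$$

and Taylor's Theorem with the Lagrange remainder is the higher-order version of the same idea: if $f$ is $(n+1)$-times differentiable, then

$$f(b) = \sum_{k=0}^{n} \frac{f^{(k)}(a)}{k!}(b - a)^k + \frac{f^{(n+1)}(c)}{(n+1)!}(b - a)^{n+1}$$

for some $c$ between $a$ and $b$. The $n = 0$ case is just the MVT.)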
2018-01-24, 5:29 PM #135
Originally posted by Reid:
This is the exact sort of uninsightful Bourbakian proof you were deriding a few days ago, though. Neat, but uninsightful.


Also this criticism is really unimpressive to me. Lame. You don't like the simpler and easier to understand route?
2018-01-24, 6:00 PM #136
Anyway, your idea is very nice too. But analyticity is quite a strong assumption, isn't it?
2018-01-24, 7:03 PM #137
Originally posted by Reverend Jones:
Also this criticism is really unimpressive to me. Lame. You don't like the simpler and easier to understand route?


Easier to follow, sure. But it tells you nothing about why that's the case.
2018-01-24, 7:05 PM #138
Originally posted by Reverend Jones:
Anyway, your idea is very nice too. But analyticity is quite a strong assumption, isn't it?


Most functions we deal with that are smooth are analytic; smooth but nowhere-analytic functions do exist, but they're pretty much only used as counterexamples.
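
(The textbook example showing smooth doesn't imply analytic, for concreteness:

$$f(x) = \begin{cases} e^{-1/x^2}, & x \neq 0 \\ 0, & x = 0 \end{cases}$$

is infinitely differentiable, but every derivative at 0 vanishes, so its Taylor series at 0 is identically zero and fails to equal $f$ on any neighborhood of 0. That one is still analytic away from 0; genuinely nowhere-analytic smooth functions exist too, but they take more work to construct.)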
2018-01-24, 7:10 PM #139
Most functions aren't even continuous. :D
2018-01-24, 7:11 PM #140
Originally posted by Reverend Jones:
Most functions aren't even continuous. :D


Yup, most functions that we actively deal with aren't though!
2018-01-24, 7:11 PM #141
Originally posted by Reid:
Easier to follow, sure. But it tells you nothing about why that's the case.


If you have an extra hypothesis, it's not even the same theorem.
2018-01-24, 7:11 PM #142
Originally posted by Reid:
Yup, most functions that we actively deal with aren't though!


You're using the royal we like my brain is a math textbook you are writing. I'm not even a mathematician.
2018-01-24, 7:12 PM #143
Originally posted by Reid:
Easier to follow, sure. But it tells you nothing about why that's the case.


Is the version you gave constructive in some way? Is it connected with a larger theory that explains how to solve problems related to the theorem? Or is it just that Taylor series are more convenient? Just curious.
2018-01-24, 7:14 PM #144
Quote:
also makes sense why it's basically the way to solve differential equations nontrivially,


Okay, I guess this is why you were interested.
2018-01-24, 7:20 PM #145
Originally posted by Reverend Jones:
If you have an extra hypothesis, it's not even the same theorem.


Well, as you pointed out, using the quotient rule the way you do depends on other theorems. You might be assuming the same things without realizing it.

Originally posted by Reverend Jones:
Is the version you gave constructive in some way? Connected with a larger theory that explains how to solve a problem connected with the theorem? Or is it just that Taylor series are more convenient? Just curious.


Well, for one, it does appear to be constructive, not that I care so much.

It's more that analyticity is a really important concept and incredibly useful, so understanding why series expansions work, and how they relate to and explain basic properties, is more important than just understanding the properties themselves.

Even just past e^x as an eigenfunction, power series expansions are absolutely used to solve higher order differential equations, and crop up in all sorts of places.
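
(Quick example of the kind of thing I mean: to solve $y'' + y = 0$, plug in $y = \sum_{n \ge 0} a_n x^n$. Matching coefficients gives the recurrence

$$a_{n+2} = -\frac{a_n}{(n+2)(n+1)},$$

so the even coefficients assemble into $a_0 \cos x$ and the odd ones into $a_1 \sin x$, and the general solution $y = a_0 \cos x + a_1 \sin x$ drops straight out of the recurrence.)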
2018-01-24, 7:21 PM #146
Originally posted by Reverend Jones:
You're using the royal we like my brain is a math textbook you are writing. I'm not even a mathematician.


Yeah, I'm just trying to make the distinction between the analyst's sense of "most" (measure 1) and the common-language "most"; the latter is how I was using it.
2018-01-24, 7:31 PM #147
Originally posted by Reid:
Well, as you pointed out, in order to use the quotient rule in the way you do it depends on other theorems. You might be assuming the same things, just unaware of it.


You don't need analyticity to use the mean value theorem.

Originally posted by JLLagrange:
Right, but isn't this still (a version of) the Picard-Lindelof theorem that another commenter brought up?





Originally posted by dm287:
Well it is a special case of it, but it definitely doesn't NEED such a high-powered theorem to prove. It can be done by the Mean Value Theorem as follows:

Assume that f is such that it is differentiable everywhere with derivative 0. Then by MVT:

[f(b) - f(a)]/(b - a) = f'(c) = 0 for any two points a, b. Thus f(a) = f(b), so f is constant. Thus only constants have derivative 0.

The general case immediately follows:

Assume f and g are such that f' = g'. Then:

f' - g' = 0
(f - g)' = 0
By what we just proved, f - g = C, so f = g + C.


https://www.reddit.com/r/math/comments/r3w8r/any_other_functions_than_ex_and_0_that_are/c42zyii/
2018-01-24, 7:37 PM #148
Originally posted by Reverend Jones:


Oh, cool. That's a better theorem then.
2018-01-24, 7:55 PM #149
Originally posted by Reid:
It's more that analyticity is a really important concept and incredibly useful, so understanding why series expansions work, and how they relate to and explain basic properties, is more important than just understanding the properties themselves.

Even just past e^x as an eigenfunction, power series expansions are absolutely used to solve higher order differential equations, and crop up in all sorts of places.


I am totally on board with analyticity being a powerful tool. In fact the property goes beyond differential equations and has discrete analogs.

http://web.cs.elte.hu/~lovasz/analytic.pdf

Edit: Actually, this is a little off-topic, since it's really about discrete analogs of analytic continuation (analyticity in the complex-analysis sense) rather than the general notion of analyticity being discussed in this thread.
2018-01-24, 8:21 PM #150
Originally posted by Reid:
also makes sense why it's basically the way to solve differential equations nontrivially,


Are you talking about the contraction mapping theorem, which gives a constructive proof of existence and uniqueness for some ODEs?

But do we really care about analyticity, or Lipschitz continuity? Analytic => smooth => continuously differentiable => (locally) Lipschitz continuous. Analyticity is a really powerful hypothesis, but none of those implications reverse, as far as I know.

Incidentally, there is the interesting question of what about analyticity is "natural". A bunch of things in physics seem to work well assuming it, and like I said, physicists love their Taylor series. However, this physicist makes the case that in fact analyticity isn't completely universal in physics.
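
Here's a toy numeric sketch of the constructive flavor I mean, just Picard iteration for y' = y with y(0) = 1 (my own illustration, assuming only numpy; it's not from any of the linked pages):

import numpy as np

# Picard iteration for y' = y, y(0) = 1 on [0, 1]:
# repeatedly apply the integral operator (T y)(x) = 1 + integral from 0 to x of y(t) dt.
# The contraction mapping theorem guarantees the iterates converge to the unique
# solution (here e^x); each iterate is, up to quadrature error, the next Taylor partial sum.

xs = np.linspace(0.0, 1.0, 1001)
y = np.ones_like(xs)  # y_0(x) = 1, the initial guess

for k in range(10):
    # cumulative trapezoidal approximation of the integral of y from 0 to each x
    integral = np.concatenate(([0.0], np.cumsum((y[1:] + y[:-1]) / 2 * np.diff(xs))))
    y = 1.0 + integral  # y_{k+1}(x) = 1 + integral_0^x y_k(t) dt

print(np.max(np.abs(y - np.exp(xs))))  # tiny: the iterates have converged to e^x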
2018-01-31, 12:24 AM #151
http://wry.me/hacking/Turing-Drawings

Quote:
After watching a documentary about Alan Turing, I started thinking about Turing machines again. They’re a very useful tool for computer science proofs and an elegantly simple computational model. Beyond this, however, Turing machines seem relatively useless in terms of “real-world” applications. Any computable function can be computed by a Turing machine, but you probably wouldn’t want to use one as a compilation target: it would be too inefficient.

One possible application for Turing machines is to use them as programs to generate data procedurally. Their simplicity makes it possible, for example, to create working programs randomly. Generating a random stream of x86 instructions that doesn't crash could be tricky, but with a Turing machine, it's quite easy. I decided to try something like this to produce so-called generative art. The program I wrote generates random Turing machines that operate on a two-dimensional grid instead of a one-dimensional tape. The symbols written on the grid can be interpreted as colors, and voilà: we have procedural drawings.
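
The idea is simple enough to sketch in a few lines of Python. This is just a toy reconstruction of what the post describes (random transition table, 2-D grid, symbols as colors), not the author's actual code:

import random

# A random 2-D "Turing drawing": a head walks on a toroidal grid, and the symbol
# written in each cell is interpreted as a color (here, a character).
STATES, SYMBOLS, SIZE, STEPS = 4, 3, 64, 200000
MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0)]  # right, left, down, up

# random transition table: (state, symbol read) -> (symbol to write, move, next state)
table = {(q, s): (random.randrange(SYMBOLS), random.choice(MOVES), random.randrange(STATES))
         for q in range(STATES) for s in range(SYMBOLS)}

grid = [[0] * SIZE for _ in range(SIZE)]
x = y = SIZE // 2
state = 0
for _ in range(STEPS):
    write, (dx, dy), state = table[(state, grid[y][x])]
    grid[y][x] = write
    x, y = (x + dx) % SIZE, (y + dy) % SIZE  # wrap around the edges

chars = " .#"  # one "color" per symbol
print("\n".join("".join(chars[c] for c in row) for row in grid))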
2018-01-31, 12:30 AM #152
Interesting to see that somebody got it to do this.

http://www.wry.me/hacking/Turing-Drawings/#3,6,2,2,3,2,4,0,0,1,0,2,1,2,1,1,0,1,2,3,2,3,0,2,1,0,2,5,3,2,5,2,2,4,1,1,5,0,2,4,3,0,4,0,0,1,1,2,1,3,2,1,0,2,2,0

Looks like sand.
2018-01-31, 12:36 AM #153
I found this one, kinda neat:

http://www.wry.me/hacking/Turing-Drawings/#4,3,2,1,3,2,2,2,1,1,3,2,1,2,0,2,1,2,1,1,0,1,2,3,2,0,2,2,0,0,2,1,3,1,2,1,2,3
2018-01-31, 12:45 AM #154
"City erected and then eroded":

http://www.wry.me/hacking/Turing-Drawings/#4,3,2,1,3,0,2,2,1,1,0,2,2,3,2,1,0,3,1,0,2,2,1,2,1,2,2,1,3,0,1,0,3,2,1,0,2,0
2018-01-31, 12:57 AM #155
A wild fractal appears!

http://www.wry.me/hacking/Turing-Drawings/#4,3,3,1,1,1,2,0,1,1,1,2,1,1,2,2,2,3,2,1,0,2,3,0,1,3,3,1,2,2,2,0,0,2,3,3,2,2

(You can slow it way down for the ones that run really fast.)
2018-01-31, 1:00 AM #156
There should be a button that lets you fast-forward to the end.
2018-01-31, 1:09 AM #157
Yeah the ergonomics of the UI leave something to be desired.

There are a lot of duds. What would be nice is to generate 1000 or so as GIFs, view them all as thumbnails in one giant page, and scroll through to find the good ones.
2018-01-31, 1:12 AM #158
Related
2018-01-31, 1:22 AM #159
So I didn't discover these two, but found them to be attractive:

laser rain


bubbles
2018-01-31, 1:25 AM #160
Looks like the author also used Turing machines to generate music. Actually it doesn't sound half bad.
