
Thread: Computer Science and Math and Stuff

  1. #121
    Quote Originally Posted by Reverend Jones View Post
    but which still require testing to verify
    Unless (emphasis added) your name is Albert Einstein!

    Today, we do experiments in our heads. The University of Houston's College of Engineering presents this series about the machines that make our civilization run, and the people whose ingenuity created them.

The late 19th-century Austrian physicist and philosopher Ernst Mach added a phrase to our language. It was "thought experiment." It's a wonderfully contradictory idea.

    One scientist observes and measures nature. Another writes mathematical theories. Experiment and thought seem to be separate roads to learning. Mach walked the experimenter's road. But he said we can also experiment in our heads, not just in nature.

    He said, look what Galileo did. Aristotle thought heavy things should fall faster than light ones. He thought a stone should fall faster than a feather. So Galileo said, "Let's glue a feather to a stone and drop it. Together, they weigh more than either the feather or the stone alone. If Aristotle's right, they'll fall faster together than either would fall alone. But who could believe the feather wouldn't slow the stone? Aristotle has to be wrong."

    That's what we call a thought experiment. Galileo did the experiment in his head. Mach made a funny argument to justify doing that. He said our intuition is accurate because evolution has shaped it. We shrink from the absurd. That's what Galileo was doing when he said a feather can't speed the fall of a stone.

    Enter now the grand master of thought experiments, Albert Einstein. Einstein was Mach's disciple. He created a great theater of thought experiments to explain relativity. But relativity flies in the teeth of our intuition. He talked about throwing balls and reading clocks on railroad trains and elevators. His conclusions had nothing to do with intuition.

    Mach finally cracked under the weight of Einstein's admiration. Relativity theory was, he wrote, "growing more and more dogmatic." For Mach, a theory was only a framework for our measurements. Einstein figured that you first wrote correct theories. Then the data would simply fall into place.

    In 1919 Einstein shrugged off an important telegram. Someone wired that they'd seen rays of light bend under the influence of gravity. That's what relativity predicts. Einstein was unimpressed. If light hadn't bent, he said, then "I'd have been sorry for the dear Lord. The theory is correct!"

    So Einstein took Mach's thought experiments and used them in ways Mach never intended. He used them to show us a world that's far more subtle and surprising than anything our intuition ever prepared us for. Mach's ideas about thought experiments finally faded because the world is much stranger than our capacity to imagine it.
    https://www.uh.edu/engines/epi576.htm

  2. #122
    ^^vv<><>BASTART
    Posts
    8,967
TAing calculus the 2nd time around feels so much smoother.

  3. #123
    Zulenglashernbracker
    Posts
    5,883
    TAing circuits the fourth time around feels like ****
    I had a blog. It sucked.

  4. #124
    ^^vv<><>BASTART
    Won't be me! Next year I start instructing!

    Woo!

  5. #125
    ^^vv<><>BASTART
    so it turns out, there's actually a pretty easy proof that if f=f' and f is analytic, then f=ce^x for some constant c

    i always suspected that f=f' meant either f=0 or f=ce^x, but i never saw such an elegant proof of uniqueness

    also makes sense why it's basically the way to solve differential equations nontrivially, as having an eigenfunction seems pretty durn important



    probably isn't special to people who are smart
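Sketching the argument (my reconstruction, since the post doesn't spell it out; term-by-term differentiation is justified for analytic functions):

```latex
% Suppose f is analytic with f = f'. Write f(x) = \sum_{n \ge 0} a_n x^n.
% Differentiating term by term: f'(x) = \sum_{n \ge 0} (n+1) a_{n+1} x^n.
% Matching coefficients in f = f' gives a_{n+1} = a_n/(n+1), so a_n = a_0/n!, and
f(x) = a_0 \sum_{n \ge 0} \frac{x^n}{n!} = a_0 e^x .
```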
    Last edited by Reid; 01-24-2018 at 04:10 PM.

  6. #126
    ^^vv<><>BASTART
    i'm not certain though about the truth of this statement outside of analytic functions, nor about the analyticity of solutions to differential equations

  7. #127
    ^^vv<><>BASTART
    On second thought, it's not particularly surprising after realizing how Taylor expansion works. Don't care, insights are insights.

  8. #128
    Quote Originally Posted by Reid View Post
    so it turns out, there's actually a pretty easy proof that if f=f' and f is analytic, then f=ce^x for some constant c

    i always suspected that f=f' meant either f=0 or f=ce^x, but i never saw such an elegant proof of uniqueness

    also makes sense why it's basically the way to solve differential equations nontrivially, as having an eigenfunction seems pretty durn important



    probably isn't special to people who are smart
    https://www.reddit.com/r/math/commen...nd_0_that_are/

  9. #129
    A First Course in Calculus, Serge Lang


    https://books.google.com/books?id=vy...page&q&f=false

  10. #130
    Quote Originally Posted by Reid View Post
    i'm not certain though about the truth of this statement outside of analytic functions, nor about the analyticity of solutions to differential equations
    Quote Originally Posted by Reid View Post
    On second thought, it's not particularly surprising after realizing how Taylor expansion works. Don't care, insights are insights.
    Quotient rule.
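Presumably the intended argument is something like this (spelling it out, since the post doesn't):

```latex
% Let f satisfy f' = f, and consider g(x) = f(x)/e^x. By the quotient rule,
g'(x) = \frac{f'(x)\,e^x - f(x)\,e^x}{e^{2x}} = \frac{f'(x) - f(x)}{e^x} = 0,
% so g is constant (by the mean value theorem), i.e. f(x) = c\,e^x.
```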

  11. #131
    ^^vv<><>BASTART
    Quote Originally Posted by Reverend Jones View Post
    Quotient rule.
    This is the exact sort of uninsightful Bourbakian proof you were deriding a few days ago, though. Neat, but uninsightful.

  12. #132
    It's called 'calculus' for a reason.

    Newton used it in secret but recast the Principia in the language of geometry in order to avoid criticism.

  13. #133
    Also if you read that Reddit page, apparently the use of the quotient rule in this way depends on a much more general and snobby theorem.

  14. #134
    https://www.reddit.com/r/math/commen...t_are/c42w3ui/

    http://en.wikipedia.org/wiki/Picard%...C3%B6f_theorem

    ...but as dm287 wrote, you can just use the mean value theorem. Which iirc is also the key theorem justifying Taylor's Theorem, which seems closer to your approach. I do admit that using Taylor's Theorem feels quite majestic (physicists love it).
    Last edited by Reverend Jones; 01-24-2018 at 08:28 PM.

  15. #135
    Quote Originally Posted by Reid View Post
    This is the exact sort of uninsightful Bourbakian proof you were deriding a few days ago, though. Neat, but uninsightful.
    Also this criticism is really unimpressive to me. Lame. You don't like the simpler and easier to understand route?

  16. #136
    Anyway, your idea is very nice too. But analyticity is quite a strong assumption, isn't it?

  17. #137
    ^^vv<><>BASTART
    Quote Originally Posted by Reverend Jones View Post
    Also this criticism is really unimpressive to me. Lame. You don't like the simpler and easier to understand route?
    Easier to follow, sure. But it tells you nothing about why that's the case.

  18. #138
    ^^vv<><>BASTART
    Quote Originally Posted by Reverend Jones View Post
    Anyway, your idea is very nice too. But analyticity is quite a strong assumption, isn't it?
Most functions we deal with that are smooth are analytic; smooth but nowhere analytic functions exist, but they're pretty much only used as counter-examples.
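The classic example of smooth-but-not-analytic (at a single point, which is already enough to break a Taylor-series argument):

```latex
f(x) = \begin{cases} e^{-1/x^2} & x \neq 0 \\ 0 & x = 0 \end{cases}
% f is smooth everywhere and every derivative f^{(n)}(0) = 0,
% so its Taylor series at 0 is identically zero -- yet f(x) > 0 for x \neq 0.
% Hence f is smooth but not analytic at 0.
```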

  19. #139
    Most functions aren't even continuous.

  20. #140
    ^^vv<><>BASTART
    Quote Originally Posted by Reverend Jones View Post
    Most functions aren't even continuous.
Yup, but most functions that we actively deal with are!

  21. #141
    Quote Originally Posted by Reid View Post
    Easier to follow, sure. But it tells you nothing about why that's the case.
    If you have an extra hypothesis, it's not even the same theorem.

  22. #142
    Quote Originally Posted by Reid View Post
Yup, but most functions that we actively deal with are!
    You're using the royal we like my brain is a math textbook you are writing. I'm not even a mathematician.

  23. #143
    Quote Originally Posted by Reid View Post
    Easier to follow, sure. But it tells you nothing about why that's the case.
    Is the version you gave constructive in some way? Connected with a larger theory that explains how to solve a problem connected with the theorem? Or is it just that Taylor series are more convenient? Just curious.

  24. #144
Quote Originally Posted by Reid View Post
    also makes sense why it's basically the way to solve differential equations nontrivially,
    Okay, I guess this is why you were interested.

  25. #145
    ^^vv<><>BASTART
    Quote Originally Posted by Reverend Jones View Post
    If you have an extra hypothesis, it's not even the same theorem.
Well, as you pointed out, using the quotient rule in the way you do depends on other theorems. You might be assuming the same things, just unaware of it.

    Quote Originally Posted by Reverend Jones View Post
    Is the version you gave constructive in some way? Connected with a larger theory that explains how to solve a problem connected with the theorem? Or is it just that Taylor series are more convenient? Just curious.
    Well, for one, it does appear to be constructive, not that I care so much.

It's more that analyticity is a really important concept, and is incredibly useful, so understanding why series expansions work, and how they relate to and explain basic properties, is more important than just understanding the properties themselves.

    Even just past e^x as an eigenfunction, power series expansions are absolutely used to solve higher order differential equations, and crop up in all sorts of places.
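As a concrete sketch of that last point (my own illustration, not from the thread): the same coefficient-matching that solves f' = f also solves a second-order equation like y'' = -y, where it produces the cosine/sine series.

```python
import math

def series_solve_y_pp_eq_minus_y(a0, a1, terms=30):
    """Power-series solution of y'' = -y with y(0) = a0, y'(0) = a1.

    Plugging y(x) = sum a_n x^n into y'' = -y and matching coefficients
    gives the recurrence a_{n+2} = -a_n / ((n+1)(n+2)).
    """
    a = [0.0] * terms
    a[0], a[1] = a0, a1
    for n in range(terms - 2):
        a[n + 2] = -a[n] / ((n + 1) * (n + 2))
    return a

def eval_series(coeffs, x):
    """Evaluate the truncated power series at x."""
    return sum(c * x**n for n, c in enumerate(coeffs))

# With y(0) = 1, y'(0) = 0 the series reproduces cos(x):
coeffs = series_solve_y_pp_eq_minus_y(1.0, 0.0)
print(eval_series(coeffs, 1.0), math.cos(1.0))  # the two agree closely
```

With a0 = 0, a1 = 1 the same recurrence spits out the sine series instead.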

  26. #146
    ^^vv<><>BASTART
    Quote Originally Posted by Reverend Jones View Post
    You're using the royal we like my brain is a math textbook you are writing. I'm not even a mathematician.
Yeah, I'm just trying to make the distinction between the analytic sense of "most" (measure 1) and the common-language "most"; the latter is how I was using it.

  27. #147
    Quote Originally Posted by Reid View Post
    Well, as you pointed out, in order to use the quotient rule in the way you do it depends on other theorems. You might be assuming the same things, just unaware of it.
    You don't need analyticity to use the mean value theorem.

    Quote Originally Posted by JLLagrange
    Right, but isn't this still (a version of) the Picard-Lindelof theorem that another commenter brought up?


    Quote Originally Posted by dm287
    Well it is a special case of it, but it definitely doesn't NEED such a high-powered theorem to prove. It can be done by the Mean Value Theorem as follows:

Assume that f is differentiable everywhere with derivative 0. Then by the MVT, for any two points a, b there is some c between them with

    (f(b) - f(a))/(b - a) = f'(c) = 0.

    Thus f(a) = f(b), so f is constant. Thus only constants have derivative 0.

    The general case immediately follows. Assume f and g are such that f' = g'. Then:

    f' - g' = 0
    (f - g)' = 0

    By what we just proved, f - g = C, i.e. f = g + C.
    https://www.reddit.com/r/math/commen...t_are/c42zyii/

  28. #148
    ^^vv<><>BASTART
    Quote Originally Posted by Reverend Jones View Post
You don't need analyticity to use the mean value theorem.

    https://www.reddit.com/r/math/commen...t_are/c42zyii/
    Oh, cool. That's a better theorem then.

  29. #149
    Quote Originally Posted by Reid View Post
    It's more that, analyticity is a really important concept, and is incredibly useful, so understanding why series expansions work and how they relate and explain basic properties is more important than just understanding the properties themselves.

    Even just past e^x as an eigenfunction, power series expansions are absolutely used to solve higher order differential equations, and crop up in all sorts of places.
I am totally on board with analyticity being a powerful tool. In fact the property goes beyond differential equations, and has discrete analogues.

    http://web.cs.elte.hu/~lovasz/analytic.pdf

    Edit: Actually, this is a little off-topic, since this is really talking about discrete analogs of analytic continuations, which is about analyticity in the setting of complex analysis, rather than the general discussion of analyticity in this thread.
    Last edited by Reverend Jones; 01-24-2018 at 11:38 PM.

  30. #150
    Quote Originally Posted by Reid
    also makes sense why it's basically the way to solve differential equations nontrivially,
Are you talking about the contraction mapping theorem, which gives a constructive method for solving some ODEs?

But do we really care about analyticity, or Lipschitz continuity? Analytic => smooth => continuously differentiable => (locally) Lipschitz continuous. Analyticity is a really powerful hypothesis, but the converses are false as far as I know.
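A minimal sketch of Picard iteration for the toy problem y' = y, y(0) = 1 (my own illustration of the constructive flavor, representing each iterate by its polynomial coefficients):

```python
import math

def picard_iterate(coeffs):
    """One Picard step for y' = y, y(0) = 1:
    y_new(x) = 1 + integral from 0 to x of y(t) dt,
    done exactly on polynomial coefficients."""
    return [1.0] + [c / (k + 1) for k, c in enumerate(coeffs)]

def poly_eval(coeffs, x):
    """Evaluate the polynomial with the given coefficients at x."""
    return sum(c * x**k for k, c in enumerate(coeffs))

y = [1.0]            # y_0(x) = 1
for _ in range(20):  # each step adds the next Taylor term of e^x
    y = picard_iterate(y)

print(poly_eval(y, 1.0), math.e)  # the iterates converge to e^x
```

After n steps the iterate is exactly the degree-n Taylor polynomial of e^x, which is the contraction mapping theorem being constructive in the most literal way.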

    Incidentally, there is the interesting question of what about analyticity is "natural". A bunch of things in physics seem to work well assuming it, and like I said, physicists love their Taylor series. However, this physicist makes the case that in fact analyticity isn't completely universal in physics.
    Last edited by Reverend Jones; 01-25-2018 at 12:12 AM.

  31. #151
    http://wry.me/hacking/Turing-Drawings

    After watching a documentary about Alan Turing, I started thinking about Turing machines again. They’re a very useful tool for computer science proofs and an elegantly simple computational model. Beyond this, however, Turing machines seem relatively useless in terms of “real-world” applications. Any computable function can be computed by a Turing machine, but you probably wouldn’t want to use one as a compilation target: it would be too inefficient.

    One possible application for Turing machines is to use them as programs to generate data procedurally. Their simplicity makes it possible, for example, to create working programs randomly. Generating a random stream of x86 instructions that doesn’t crash could be tricky, but with a Turing machine, it’s quite easy. I decided to try something like this to produce so-called generative art. The program I wrote generates random Turing machines that operate on a two-dimensional grid instead of a one-dimensional tape. The symbols written on the grid can be interpreted as colors, and voilà: we have procedural drawings.
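A minimal sketch of the idea (my guess at how it might work, not the author's actual code): generate a random transition table, run it on a toroidal grid, and read the written symbols as colors.

```python
import random

def random_turing_drawing(states=4, symbols=3, size=32, steps=5000, seed=1):
    """Run a randomly generated Turing machine on a 2D wrap-around grid.

    Transition table: (state, symbol) -> (new_state, new_symbol, direction).
    The grid of symbols left behind can be rendered as colors.
    """
    rng = random.Random(seed)
    table = {
        (s, c): (rng.randrange(states), rng.randrange(symbols), rng.randrange(4))
        for s in range(states) for c in range(symbols)
    }
    grid = [[0] * size for _ in range(size)]
    x = y = size // 2
    state = 0
    moves = [(0, -1), (1, 0), (0, 1), (-1, 0)]  # up, right, down, left
    for _ in range(steps):
        # Look up the rule for the current state and cell, write, and move.
        state, grid[y][x], d = table[(state, grid[y][x])]
        dx, dy = moves[d]
        x, y = (x + dx) % size, (y + dy) % size  # wrap at the edges
    return grid

grid = random_turing_drawing()
```

Because the transition table is total, any random machine "runs" without crashing, which is exactly the advantage over random x86 the post mentions.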

  32. #152
    Interesting to see that somebody got it to do this.

    http://www.wry.me/hacking/Turing-Dra...,3,2,1,0,2,2,0

    Looks like sand.

  33. #153

  34. #154

  35. #155
    A wild fractal appears!

    http://www.wry.me/hacking/Turing-Dra...,0,0,2,3,3,2,2

    (You can slow it way down for these ones where they happen really fast)

  36. #156
    Admiral of Awesome
    Posts
    18,388
    There should be a button that lets you fast forward to the end

  37. #157
    Yeah the ergonomics of the UI leave something to be desired.

There are a lot of duds. What would be nice would be to generate like 1000 or so as GIFs, view them all as thumbnails in one giant document, and scroll through all at once to find the good ones.

  38. #158

  39. #159
    So I didn't discover these two, but found them to be attractive:

    laser rain


    bubbles

  40. #160
    Looks like the author also used Turing machines to generate music. Actually it doesn't sound half bad.
