Math in the Media

Also see the Blog on Math Blogs


Image of the month

"Folded books illustrate the beauty of Longest Crease, Parabola and Spirals without destroying the books but re-purposing them as sculptures. In this folded book, "Quotable Woman," a spiral of Archimedes is formed along the top edge and a cylindrical helix along the front. Furthermore, there's an internal conical helix (not visible) produced by points to which the active corners of the pages are folded." -- Sharol Nau. See images of more inspirational works in the 2019 Mathematical Art Exhibition on AMS Mathematical Imagery.


Tony Phillips' Take on Math in the Media
A monthly survey of math news

This month's topics:

Mitchell Feigenbaum obituary in the New York Times

"Mitchell J. Feigenbaum, a pioneer in the field of mathematical physics known as chaos, died on June 30 in Manhattan. He was 74." So starts Feigenbaum's obituary in the Times, July 18, 2019. The reporter is Kenneth Chang, who surveys Feigenbaum's career (at his death he was the Toyota Professor and director of the Center for Studies in Physics and Biology at Rockefeller University) and focuses on his most important mathematical discovery, the numerical constant that now bears his name. "Dr. Feigenbaum's lifestyle and his Renaissance intellect were a poor fit to the demands of modern publish-or-perish academia. But by following his own path, he uncovered a pattern of chaos that is universal in math and in nature."

  • "At Los Alamos National Laboratory in New Mexico in the mid-1970s, Dr. Feigenbaum, using a programmable calculator, found what seemed at first a mathematical curiosity. A simple equation generated a sequence of numbers, which were initially trivial: the same number over and over. But as a parameter in the equation shifted, the output became more varied. First the numbers bounced back and forth between two values, then they cycled among four values, then eight, and so on, with the rate of the change quickening until the patterns lost all hint of repeating cycles. The dynamics had, in the terminology of physics, passed into the realm of deterministic chaos. That is, each number of the sequence could be computed precisely, but the resulting pattern appeared to be complex and random."
  • "Dr. Feigenbaum looked at another simple equation, and it exhibited the same behavior, known as period doubling. More startling, the number that characterized the rate of doubling was the same: As the periods multiplied, each doubling occurred about 4.669 times as quickly as the previous one. This number is now known as the Feigenbaum constant. Dr. Feigenbaum was able to prove why it is a universal mathematical value, much as pi —the ratio of the circumference of a circle to its diameter— is the same for all circles."
  • "In 1979, a French scientist, Albert J. Libchaber, observed the same cascade of period doublings in the temperature fluctuations in the center of a convecting fluid. Dr. Feigenbaum's theory of the transition from order to chaos now described phenomena in the real world."

Chang quotes Kenneth Brecher (Astronomy, Boston University): "There aren't too many fundamental constants, and he was the only living person that had one."
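
The period-doubling cascade described in the obituary is easy to reproduce numerically. The sketch below is purely illustrative (it is not Feigenbaum's own calculation, though it is in the spirit of his programmable-calculator experiments): it uses the logistic map $x \mapsto rx(1-x)$ and Newton's method to locate the "superstable" parameter values $R_k$ at which the critical point $x=0.5$ lies on a cycle of period $2^k$; the ratios of successive gaps between the $R_k$ approach Feigenbaum's constant $\delta \approx 4.669$.

```python
# Illustrative sketch: estimating Feigenbaum's constant delta ~ 4.6692 from the
# logistic map x -> r*x*(1-x).  R_k is the "superstable" parameter at which
# x = 0.5 lies on a cycle of period 2**k; the gap ratios
# (R_k - R_{k-1}) / (R_{k+1} - R_k) converge to delta.

def logistic_iterate(r, x, n):
    """Apply x -> r*x*(1-x) to x, n times."""
    for _ in range(n):
        x = r * x * (1.0 - x)
    return x

def superstable_r(k, guess):
    """Newton-solve f_r^(2**k)(0.5) = 0.5 for the parameter r, starting from `guess`."""
    n = 2 ** k
    r = guess
    for _ in range(100):
        g = logistic_iterate(r, 0.5, n) - 0.5
        h = 1e-7   # numerical derivative with respect to r
        dg = (logistic_iterate(r + h, 0.5, n)
              - logistic_iterate(r - h, 0.5, n)) / (2.0 * h)
        step = g / dg
        r -= step
        if abs(step) < 1e-12:
            break
    return r

R = [2.0, superstable_r(1, 3.2)]          # periods 1 and 2
for k in range(2, 9):                      # periods 4, 8, ..., 256
    guess = R[-1] + (R[-1] - R[-2]) / 4.7  # crude extrapolation of the next R_k
    R.append(superstable_r(k, guess))

for k in range(1, len(R) - 1):
    print(k, (R[k] - R[k - 1]) / (R[k + 1] - R[k]))
# The printed ratios settle toward 4.6692..., Feigenbaum's constant.
```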

Mathematics and the Visual Cortex

"High-dimensional geometry of population responses in visual cortex" ran in Nature, June 26, 2019. It is the product of a 5-person collaboration between the HHMI Janelia Laboratory (Ashburn, VA) and University College, London, led by Carsen Stringer and Marius Pachitariu. They used "resonance-scanning two-photon calcium microscopy, using 11 imaging planes spaced at 35 μm" to record the simultaneous responses of 12,578 neurons in about 1/3 cubic millimeter of the visual cortex of a mouse, while the mouse was being presented with a number of images (from 32 to 2800 according to the experiment).

[Image of two sets of neural responses]

"Mean responses (trial-averaged) of 65 randomly chosen neurons to 32 image stimuli." The color represents the variance from the mean, measured in standard deviations (scale bar on the right). "Stimuli were presented twice in the same order." As is clear from the two displays, most (in fact about 80%) of the neurons showed high correlation between repeats. Image from Nature 571 361-365, used with permission.

The read-out from the targeted population of neurons can be considered as an encoding of the visual field. The authors investigated this "population code for visual stimuli" by a method they call "cross-validated principal component analysis."

  • Principal Component Analysis (PCA) is a very important application of elementary linear algebra. In this case, each of the $p$ stimuli is encoded as a point in $N$-dimensional space, where $N$ is the number of neurons sampled, and the $k$-th coordinate is the variance of the response to this stimulus by the $k$-th neuron. The result of PCA is a new set of coordinates for $N$-dimensional space. The first coordinate is in the direction ${\vec e}_1$ along which the $p$ points are maximally spread out (if there isn't one, then that set of points is not amenable to PCA). This will be the single coordinate that best distinguishes between the points. With that settled, project everything onto the $(N-1)$-dimensional subspace perpendicular to ${\vec e}_1$. In that subspace, locate ${\vec e}_2$ to maximize spread, as before. The coordinate in that direction is the one that next best distinguishes between the points. Continue (project onto the subspace perpendicular to both ${\vec e}_1$ and ${\vec e}_2$, etc.). At the end, a point ${\vec x}$ will be represented as $x_1{\vec e}_1 + x_2{\vec e}_2 + \cdots$. The coordinates $x_1, x_2, \dots$ are its principal components.
  • For example, if all the points lie along a line in $N$-dimensional space, then $x_2, x_3, \dots$ will all be zero, and $x_1$ will locate points along that line. More generally, using only $x_1, \dots, x_n$ will give the best $n$-dimensional approximation to the distribution of those points in $N$-dimensional space. (A small numerical sketch of the procedure follows this list.)
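
As a concrete, purely illustrative rendering of the procedure just described, here is a minimal PCA sketch in Python/NumPy. The data are random stand-ins for the $p \times N$ matrix of responses, not the authors' recordings, and the method is textbook eigendecomposition of the covariance matrix rather than their cross-validated variant.

```python
import numpy as np

# Toy PCA: p "stimuli" as points in N-dimensional "neuron space".  The data
# below are random stand-ins, not recordings from the experiment.
rng = np.random.default_rng(0)
p, N = 200, 50
X = rng.standard_normal((p, N)) @ rng.standard_normal((N, N))   # correlated toy data

Xc = X - X.mean(axis=0)                 # center each coordinate
C = (Xc.T @ Xc) / (p - 1)               # N x N covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)    # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]       # re-sort so e_1 has the largest spread
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Columns of eigvecs are the directions e_1, e_2, ...; the principal
# components x_1, x_2, ... of each point are its coordinates in that basis.
components = Xc @ eigvecs

# Fraction of total variance captured by the first n directions, n = 1, ..., N.
explained = np.cumsum(eigvals) / eigvals.sum()
print(explained[:5])
```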

The way the quality of that approximation increases with $n$ is an intrinsic geometric property of the original set of points. The authors report: "This method revealed that the visual population responses did not lie on any low-dimensional plane within the space of possible firing patterns. The amount of variance explained continued to increase as further dimensions were included [see next image], without saturating at any dimensionality below the maximum possible. As a control analysis, we applied cvPCA to the neural responses obtained when only 32 images were shown many times —the reliable component of these responses must, by definition, lie in a 32-dimensional subspace— and as expected we observed a saturation of the variance after 32 dimensions."

[Image of cumulative variance plot]

"Cumulative fraction of variance in planes of increasing dimension, for an ensemble of 2,800 stimuli (blue) and for 96 repeats of 32 stimuli (green). The dashed line indicates 32 dimensions." Image from Nature 571 361-365, used with permission.

The authors made another, unexpected observation: "the fraction of neural variance in planes of successively larger dimensions followed a power law. ... [T]he variance of the $n$th principal component had a magnitude that was approximately proportional to $1/n$."

 

[Image of eigenspectrum, log-log plot]

A log-log plot of the variance of the $n$th principal component as a function of $n$ (an example of an eigenspectrum). The black line shows the power-law fit $1/n^{\alpha}$, $\alpha = 1.04$, which appears as a straight line on the log-log scale.
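
Reading off such an exponent is standard: on log-log axes a power law $1/n^{\alpha}$ is a straight line of slope $-\alpha$, so a least-squares line fit recovers $\alpha$. The sketch below uses synthetic variances built to follow $1/n^{1.04}$ with a little noise; it illustrates the idea and is not the authors' fitting code.

```python
import numpy as np

# Estimating a power-law exponent from an eigenspectrum: if the variance of the
# n-th principal component is ~ n**(-alpha), then log(variance) vs. log(n) is a
# straight line of slope -alpha.  The spectrum here is synthetic, built with
# alpha = 1.04 plus noise.
rng = np.random.default_rng(2)
n = np.arange(1, 1001)
variance = n ** -1.04 * np.exp(0.05 * rng.standard_normal(n.size))

slope, intercept = np.polyfit(np.log(n), np.log(variance), deg=1)
print(f"estimated exponent alpha = {-slope:.3f}")   # close to the 1.04 built in
```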

Qualitatively, as the authors remark, "this reflects successively less variance in dimensions that encode finer stimulus features." But there is a subtler, mathematical reason for this phenomenon, which they were able to tease out. "Power-law eigenspectra are observed in many scientific domains, and are related to the smoothness of the underlying functions. For example, if a function of one variable is differentiable, its Fourier spectrum must decay asymptotically faster than a power law of exponent 1 [i.e. strictly faster than $1/n$; online reference from TUMünchen]. ... We therefore theorized that the variance power law might be related to smoothness of the neural responses. We showed mathematically that if the sensory stimuli presented can be characterized by $d$ parameters, and if the mapping from these parameters to (noise-free) neural population responses is differentiable, then the population eigenspectrum must decay asymptotically faster than a power law of exponent $\alpha = 1 + 2/d$. Conversely, if the eigenspectrum decays slower than this, a smooth neural code is impossible: its derivative tends to infinity with the number of neural dimensions, and the neural responses must lie on a fractal rather than a differentiable manifold." More simply: "If the eigenspectrum were to decay slower than $n^{-1-2/d}$ then the neural code would emphasize fine stimulus features so strongly that it could not be differentiable." This requires quite a bit of analysis (see their Supplementary material, 2).

The authors conclude: "Neural representations with close-to-critical power-law eigenspectra may provide the brain with codes that are as efficient and flexible as possible while still allowing robust generalization."

Much ado about PEMDAS

PEMDAS (mnemonic: Please Excuse My Dear Aunt Sally) is taught to children in certain schools so they can decipher mathematical expressions involving several operations, where the answer may depend on the order in which they are performed. Precedence is taken first by Parentheses, then by Exponentiation, then by Multiplication and Division (equal precedence), and finally Addition and Subtraction (equal). Operations of equal precedence are to be executed in left-to-right order.
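
Programming languages encode essentially the same precedence rules, so the viral expression discussed below can be checked directly. The snippet is just an illustration; note that Python has no implicit multiplication, so the ambiguity has to be resolved with parentheses.

```python
# Standard precedence: * and / have equal rank and are evaluated left to right,
# exactly as PEMDAS prescribes.  There is no implicit multiplication in Python;
# writing 8/2(2+2) would try to call the integer 2 as a function and fail.
print(8 / 2 * (2 + 2))      # 16.0 -- the "by the rules" reading
print((8 / 2) * (2 + 2))    # 16.0 -- the same thing with the grouping made explicit
print(8 / (2 * (2 + 2)))    # 1.0  -- the reading in which 2(2+2) binds first
```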

 

  • The current flap over PEMDAS originated on Twitter and seems first to have been reported in Popular Mechanics (July 31, 2019). Andrew Daniels wrote "This Simple Math Problem Drove Our Entire Staff Insane. Can You Solve It?" He posts the tweet $8\div 2(2+2)=?$ and comments " ...this maddening math problem has gone viral, following in the grand tradition of such traumatic events as The Dress and Yanny/Laurel. These kinds of conundrums are purposely meant to divide and conquer, and predictably, the seemingly simple problem posed in the offending tweet — $8\div 2(2+2)$ — practically caused a civil war in the Popular Mechanics office ... " He goes on to document the way the staff wasted their time that day. Towards the end he called the AMS, and posts "A Brief Statement from Mike Breen, the Public Awareness Officer for the American Mathematical Society, Whose Job Is to 'Try to Tell People How Great Math Is,'" where Mike explains how, by the rules, the answer is 16. "But the way it's written, it's ambiguous. ... I wouldn't hit someone on the wrist with a ruler if they said 1."
  • Daniels' posting was picked up the same day by Frank Miles of the Fox News Network. "Viral math problem baffles many on Internet: Can you solve $8\div 2(2+2)$? The equation went online this week on Twitter causing major confusion over the right answer."
  • By August 2, the commotion had reached the New York Times. Steven Strogatz takes up the question and carefully explains how the rules mandate the answer 16. But he is somewhat apologetic about it: "Now realize, following Aunt Sally is purely a matter of convention. In that sense, PEMDAS is arbitrary. Furthermore, in my experience as a mathematician, expressions like $8\div 2\times4$ look absurdly contrived. No professional mathematician would ever write something so obviously ambiguous. We would insert parentheses to indicate our meaning and to signal whether the division should be carried out first, or the multiplication."
  • Many readers disagreed with Strogatz, who came back to the question in the Times on August 5. "After reading through the many comments on the article, I realized most of these respondents were using a different (and more sophisticated) convention than the elementary PEMDAS convention I had described in the article. In this more sophisticated convention, which is often used in algebra, implicit multiplication is given higher priority than explicit multiplication or explicit division, in which those operations are written explicitly with symbols like $\times * /$ or $\div$. Under this more sophisticated convention, the implicit multiplication in $2(2 + 2)$ is given higher priority than the explicit division in $8\div 2(2 + 2)$. In other words, $2(2+2)$ should be evaluated first. Doing so yields $8\div2(2 + 2) = 8\div8 = 1.$" His analysis was summarized in the subtitle: "The confusion (likely intentional) boiled down to a discrepancy between the math rules used in grade school and in high school."
  • On August 6 (viral, or what?) another piece in the New York Times. Kenneth Chang contributes "Why Mathematicians Hate That Viral Equation". [Image of two lemurs with an abacus]. "It's formatted to confuse people, and there are no interesting underlying concepts."
  • One last salvo from Kenneth Chang, in the Times on August 21: How Many Triangles Are There? Here's How to Solve the Puzzle, where he explains how to solve a puzzle he proposed on the 6th; this one does have "interesting underlying concepts."
  • Re PEMDAS: an earlier and cleverer viral tweet, reported in the Hindustan Times on July 16, involved the equation $230-220\times 0.5 =?$ And "You probably won't believe it but the answer is 5!"
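
The twist, of course, is the exclamation point: with standard precedence $230-220\times 0.5 = 120$, and $120 = 5!$ (five factorial). A two-line check, again purely illustrative:

```python
import math

print(230 - 220 * 0.5)     # 120.0 -- multiplication before subtraction
print(math.factorial(5))   # 120   -- so the answer really is "5!"
```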

 

Tony Phillips
Stony Brook University
tony at math.sunysb.edu

FC Review Archive

Archive of Reviews: Books, plays and films about mathematics

Citations for reviews of books, plays, movies and television shows that are related to mathematics (but are not aimed solely at the professional mathematician). The alphabetical list includes links to the sources of reviews posted online, and covers reviews published in magazines, science journals and newspapers since 1996.

More . . .