Tony Phillips' Take on Math in the Media
A monthly survey of math news

This month's topics:

Order from chaos in Monterey Bay

[Figures: surface currents on December 24 and December 25, 1999]

High-frequency radar measurements of surface currents in Monterey Bay at 1900 GMT on December 24 and 25, 1999. Tidal effects have been subtracted. These are parts of stills from an animation available on the Department of Oceanography, Naval Postgraduate School website and used with permission.

The New York Times Science section for Tuesday, September 28, 2009 featured a report by Bina Venkataraman on recent progress in analyzing complex fluid flows, and in particular the "seemingly jumbled currents" in Monterey Bay. "Assisted by instruments that can track in fine detail how parcels of fluid move, and by low-cost computers that can crunch vast amounts of data quickly, researchers have found hidden structures beyond Monterey Bay, structures that explain why aircraft meet unexpected turbulence, why the air flow around a car causes drag and how blood pumps from the heart's ventricles."

The "hidden structure" is a Lagrangian coherent structure (LCS), which can be extracted from the record of the surface flow vectorfield v(x,y,t) over time. Very briefly (see Shawn Shadden's LCS Tutorial), for each time t a finite-time Lyapunov analysis constructs a function σ(x,y,t): the maximum stretching between points near (x,y) as they move following the vectorfield v starting at time t. For a fixed t, the Lagrangian coherent structure associated with v is the system of ridges of σ. Shadden explains ridges in terms of walking on the graph of σ: "Intuitively, a ridge is a curve such that if somebody [were] walking along a ridge, then stepping in the direction transverse to the ridge meant that they would be stepping down, and additionally the topography would drop off most steeply in that direction." Under suitable conditions, these ridges are time-dependent invariant manifolds: the ridges move with time, but particles on a ridge flow along that ridge, and the flux across a ridge is very small (on the order of experimental error in the examples discussed). The consequence is that a ridge acts as a moving separatrix: points that start on one side stay on that side.

For Monterey Bay, particles that start outside the LCS (black curve in the image below) will eventually be swept out to sea, while particles inside will remain in the bay much longer. As Venkataraman remarks, there are applications to controlling pollution, tracking the spread of oil spills, and even improving search-and-rescue operations for people lost at sea. (There is a nice animation of an elementary example on page 7 of the LCS Tutorial; a recommended survey lecture by Jerrold Marsden is available on the MSRI website.)

[Figure: the LCS in Monterey Bay]

Surface current vectorfield, the function σ color-coded from dark orange (max) to dark blue (min), and the Lagrangian Coherent Structure (black) in Monterey Bay, 0900 GMT on August 7, 2000. Image courtesy of François Lekien (Université Libre de Bruxelles) and Chad Coulliette (Caltech).
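To make the finite-time Lyapunov computation behind σ concrete, here is a minimal numerical sketch in Python (using numpy). It is not the Monterey Bay calculation: the velocity field is a double-gyre test flow often used in the LCS literature, and the grid sizes and time parameters are illustrative choices. The sketch advects a grid of particles from time t to t + T, forms the gradient of the resulting flow map by finite differences, and sets σ equal to (1/|T|) times the log of the largest singular value of that gradient; ridges of the resulting field approximate the LCS.

import numpy as np

def velocity(x, y, t, A=0.1, eps=0.25, om=2*np.pi/10):
    """Double-gyre test flow on [0,2] x [0,1] (illustrative, not ocean data)."""
    a = eps * np.sin(om * t)
    b = 1 - 2 * eps * np.sin(om * t)
    f = a * x**2 + b * x
    dfdx = 2 * a * x + b
    u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
    v =  np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
    return u, v

def ftle(t0=0.0, T=15.0, nx=201, ny=101, steps=300):
    """sigma(x, y, t0): finite-time Lyapunov exponent over [t0, t0 + T]."""
    X, Y = np.meshgrid(np.linspace(0, 2, nx), np.linspace(0, 1, ny))
    x, y = X.copy(), Y.copy()
    dt = T / steps
    for k in range(steps):                       # advect particles (midpoint rule)
        u1, v1 = velocity(x, y, t0 + k * dt)
        u2, v2 = velocity(x + 0.5 * dt * u1, y + 0.5 * dt * v1, t0 + (k + 0.5) * dt)
        x, y = x + dt * u2, y + dt * v2
    # gradient of the flow map (x(T), y(T)) with respect to the initial grid
    dxdX = np.gradient(x, X[0, :], axis=1); dxdY = np.gradient(x, Y[:, 0], axis=0)
    dydX = np.gradient(y, X[0, :], axis=1); dydY = np.gradient(y, Y[:, 0], axis=0)
    # largest eigenvalue of the Cauchy-Green tensor C = (grad Phi)^T (grad Phi)
    C11 = dxdX**2 + dydX**2
    C12 = dxdX * dxdY + dydX * dydY
    C22 = dxdY**2 + dydY**2
    lam_max = 0.5 * (C11 + C22 + np.sqrt((C11 - C22)**2 + 4 * C12**2))
    return np.log(np.sqrt(lam_max)) / abs(T)     # sigma; its ridges approximate the LCS

sigma = ftle()

Contouring sigma and tracing its ridges gives pictures qualitatively like the color-coded σ field in the image above.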

Math Ed in the City Journal

The Manhattan Institute is a "conservative, market-oriented think tank" (according to Wikipedia) based on Vanderbilt Avenue in New York City. It publishes City Journal, a quarterly magazine of urban affairs. The Autumn 2009 issue contains "Who Needs Mathematicians for Math, Anyway?" by Sandra Stotsky. Stotsky served on the National Mathematics Advisory Panel, created in 2006 by Executive Order; its report was submitted to the President and the Secretary of Education on March 13, 2008. Stotsky's article, deliriously illustrated by Arnold Roth, chronicles the history of this report and of its uniformly negative reception by the mathematics education establishment.

Prof. Stotsky starts back in 1989, with the NCTM Standards. [Unfortunately the Standards and the 2000 Principles and Standards are available online to NCTM members only.] Her primary criticism is that, as an outgrowth of progressive and politically-correct thought during the 1970s and 1980s, the Standards had "underlying goals--never made clear to the general public--[which] were social, not academic." She cites Alan Schoenfeld, prominent among the authors of the high school standards in the report: "the traditional curriculum was a vehicle for . . . the perpetuation of privilege." It is clear that the battle lines were already being drawn, or at least perceived, as much on political/class lines as on mathematical ones. The National Panel was "composed of mathematicians, cognitive psychologists, mathematics educators, and education researchers," but it was chartered by a conservative president. It discovered (not surprisingly) that its recommendations could be vigorously rejected as an initiative of the "Bush Administration and US financial/corporate elites" wanting to bolster "capital's efforts to shore up the US's weakening economic global position" and not to benefit "the majority of the US people--particularly marginalized and excluded students of color and low-income students." (These last quotes are from Eric Gutstein, a mathematics educator at UIC.) So it is also not surprising that when "The panel found little if any credible evidence supporting the teaching philosophy and practices that math educators have promoted in their ed-school courses and embedded in textbooks for almost two decades," the answer came back that the report was based on "a strict and narrow definition of 'scientific evidence' and an almost exclusive endorsement of quantitative methods at the expense of qualitative approaches." (Anthony Kelly, Professor of Education, GMU.)

Stotsky is not optimistic about the near future. The organizations in charge of drawing up national standards for K-12, the National Governors Association and the Council of Chief State School Officers (supported by DoE and NEA), "have not yet invited a single mathematical or science society" to advise them. "And even if a new Congress or Secretary of Education were to support the panel's recommendations, it will be essentially business as usual in the public schools so long as math educators, joined by assessment experts and technology salesmen, continue to shape the curriculum."

Homomorphic encryption

 

Craig Gentry submitted his thesis "A Fully Homomorphic Encryption Scheme" to the Stanford Computer Science Department in September 2009. The talk he gave on the topic at STOC 2009 (the 41st Annual ACM Symposium on Theory of Computing) back in June does not seem to have penetrated the mainstream media at the time, but it raised a considerable stir in online newsletters and lists associated with the world of data security: a very useful "popular explanation" by Hal Finney on the Cryptography mailing list (June 16), along with reports in Voltage Superconductor (June 24), Forbes.com (June 24), eWeek.com (June 25) and Computerworld (June 25). There is also a useful report on the IBM website. The story finally made it to BusinessWeek in a posting ("IBM's Encryption Breakthrough for the Web") by Stephen Baker on September 30.

The commonly used RSA code is homomorphic only with respect to multiplication. If (N, k) is your public key, you encode x as x^k mod N. Since x^k · y^k = (xy)^k, and since (a mod N)(b mod N) ≡ ab mod N, it follows that the code for xy is the product of the codes for x and y. So you could correctly multiply two encrypted numbers without ever knowing what the numbers were. If an encryption is fully homomorphic, you can send your tax accountant an encrypted copy of all your financial data, and get back an encrypted copy of the amount you owe, without revealing anything about your actual income and expenses. (This example is from Andy Greenberg's article on Forbes.com.)
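A toy computation in Python (with a deliberately tiny, insecure modulus chosen only to make the numbers visible; real RSA keys are hundreds of digits long) shows the multiplicative homomorphism at work:

# Toy RSA parameters: insecure, for illustration only.
p, q = 61, 53
N = p * q                           # public modulus, 3233
k = 17                              # public exponent; (N, k) is the public key
d = pow(k, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

x, y = 7, 12
cx = pow(x, k, N)                   # code for x: x^k mod N
cy = pow(y, k, N)                   # code for y: y^k mod N
cxy = (cx * cy) % N                 # multiply the two codes, never seeing x or y

# Decrypting the product of the codes recovers the product of the plaintexts.
assert pow(cxy, d, N) == (x * y) % N

Addition, on the other hand, does not survive: cx + cy decrypts to something unrelated to x + y, which is why RSA by itself is only partially homomorphic.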

The trouble with the homomorphic encryption schemes known before Gentry is that after a small number of operations the ciphertexts accumulate too much noise to be reliably decrypted. Gentry managed to devise an initial encryption (a lattice-based encryption, not an RSA code; instead of factorization it relies on a different hard problem) that was homomorphic enough to implement its own decryption algorithm, plus a little extra. Hal Finney: "I have to go back to Gödel's and Turing's work to think of a comparable example exploiting the power of self-embedding."

Once you have a "homomorphic enough" encryption algorithm E, Gentry explains how to homomorphically implement a function f of arbitrary complexity; this is illustrated in the "bootstrapping" diagram below. You choose enough different (public key, private key) pairs so that the "little extras" add up to enough for your job. You encrypt the nth private key skn using the (n+1)st public key pkn+1, and send them all along with your information I, which you have encrypted using pk1. Suppose pk1 has exhausted its homomorphic potential, having implemented the first steps of f; call these f1. Your accountant over-encrypts f1(I) (already pk1-encrypted) using pk2, and uses the homomorphic property of E, along with the pk2-encoded copy of sk1 which you sent along, to undo the pk1 encryption "inside" pk2. The next "little extra" can be used to homomorphically implement the next part of your function; call this f2. And so on.
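Here is a deliberately non-cryptographic sketch, in Python, of that bookkeeping. The "encryption" is just XOR with a random key, the public and secret keys coincide, and the homomorphic evaluation is simulated by a helper that peeks at the plaintext, so nothing here reflects Gentry's actual lattice construction or offers any security; the sketch only traces how sk1 travels encrypted under pk2 and how the recryption step refreshes the ciphertext between f1 and f2.

import secrets

def keygen():
    k = secrets.randbits(32)
    return k, k        # in this mock, public and secret keys coincide; a real scheme keeps them distinct

def encrypt(pk, m):
    return m ^ pk      # stand-in "encryption": XOR with the key

def decrypt(sk, c):
    return c ^ sk

def eval_hom(pk, func, ciphertext):
    # Stand-in for homomorphic evaluation of func on data encrypted under pk.
    # A real scheme computes this WITHOUT the secret key; here we simulate the result.
    return encrypt(pk, func(decrypt(pk, ciphertext)))

f1 = lambda m: m + 1   # first low-depth piece of f
f2 = lambda m: 3 * m   # second low-depth piece of f

pk1, sk1 = keygen()
pk2, sk2 = keygen()
wrapped_sk1 = encrypt(pk2, sk1)     # sk1 is sent along, protected by pk2

I = 14
c = encrypt(pk1, I)                 # your information, encrypted with pk1
c = eval_hom(pk1, f1, c)            # f1 computed "inside" pk1

# Recryption: over-encrypt with pk2, then homomorphically run the decryption
# circuit using the pk2-encrypted copy of sk1.  The result is f1(I) encrypted
# under pk2 alone, with fresh homomorphic capacity for the next piece f2.
c = encrypt(pk2, c)
c = eval_hom(pk2, lambda inner: decrypt(decrypt(pk2, wrapped_sk1), inner), c)
c = eval_hom(pk2, f2, c)            # continue with f2 "inside" pk2

assert decrypt(sk2, c) == f2(f1(I)) # 3 * (14 + 1) = 45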

How soon can we buy it? Brian Prince at eWeek.com quotes Gentry: "Before it becomes a tool, more theoretical work will need to be done to make it more efficient. But now that researchers know that fully homomorphic encryption is possible and have an actual construction that they can sink their teeth into, I think there is reason to be optimistic that the efficiency will be improved dramatically." Gentry's May 14 Fields Institute lecture on this work is available online.

[Figure: Gentry's scheme]

Gentry's fully homomorphic encryption scheme. Information I is manipulated, I --> f(I), without being decoded. The process uses a finite-depth homomorphic encryption algorithm E of the self-decrypting type described by Gentry, with public keys pk1, pk2, ... and corresponding private (secret) keys sk1, sk2, ..., the number depending on the complexity of f. The function f is expressed as the composition of f1, f2, f3, ..., so that, for example, the complexity of f3 plus the complexity of the decoding algorithm decrypt: [sk2, pk2(f2 f1(I))] --> f2 f1(I) does not exceed the homomorphic depth of E. Each of the private keys sk1, sk2, ... is protected by encryption with the next public key on the list. The dashed arrows represent functions implemented homomorphically on encrypted information.

Tony Phillips
Stony Brook University
tony at math.sunysb.edu