Giggling by petro c.k.

giggling
in the lecture hall
Cox-Zucker machine


Within arithmetic geometry, the “Cox-Zucker machine” is an algorithm created by David A. Cox and Steven Zucker for determining the intersection numbers of sections of an elliptic surface.

When Cox and Zucker met as first-year graduate students at Princeton University, “we realized that we had to write a joint paper because the combination of our last names, in the usual alphabetical order, is remarkably obscene.” Five years later, as members of the faculty at Rutgers, the State University of New Jersey, they followed through with their plan.

Further reading:

‘Cox-Zucker machine’, Wikipedia article: https://en.wikipedia.org/wiki/Cox-Zucker_machine

‘Intersection numbers of sections of elliptic surfaces’, Cox, D.A. & Zucker, S., 1979, Invent Math 53, 1–44. https://doi.org/10.1007/BF01403189

Author bio:

petro c. k. laughs at life’s absurdities. And occasionally writes about them. His work appears in numerous haiku journals and experimental literary/poetry magazines, and he will have debut experimental haiku and dada poetry collections out in 2024. He encourages nanononsense at dadakuku.com. You can catch up with petro on Twitter here: @petro_ck

Check out other sciku by petro c. k. here: ‘Saturn’s Moons’, ‘Young Star’, and ‘Marble’.

The Secretary Problem

Optimal stopping
for sexist scenarios –
an odd solution.

The Secretary Problem is the ‘fear of missing out’ in mathematical form. It captures the difficulty of deciding when to stop looking through a list of options when you can’t go back to a previously dismissed option and don’t know what future options might hold. The scenario: you’re hiring a secretary and want to hire the very best. You consider each candidate in turn and must decide to hire or reject them immediately after their interview. You know how good they are and how good all the applicants before them were, but you have no way of knowing how good the remaining applicants are – the best may be yet to come. When do you hire someone, and how do you decide who is best?

It’s a problem involving optimal stopping theory – choosing when to take an action to maximise the expected reward and/or minimise the expected cost. When do you stop rejecting applicants to get the best secretary?

The shortest known proof of the solution uses the odds algorithm, devised by F. Thomas Bruss in 2000. The strategy it justifies is this: under the strict conditions of the problem, you should view a certain number of applicants without hiring any of them, then hire the next applicant who scores higher than all of those you’ve seen. The number of applicants to reject first is approximately n/e, where n is the total number of applicants and e is the base of the natural logarithm. Following this rule selects the single best applicant from the total pool around 37% (roughly 1/e) of the time, regardless of how many applicants there are.
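A quick way to see that 37% figure is to simulate the rule. The sketch below is a Monte Carlo illustration of the strategy (not Bruss’s proof): skip roughly the first n/e applicants, then hire the first one who beats them all.

```python
# Minimal simulation of the 1/e stopping rule for the Secretary Problem.
import math
import random

def best_picked(n: int) -> bool:
    """Run one trial of the 1/e rule; return True if the single best applicant is hired."""
    ranks = list(range(n))            # higher value = better applicant
    random.shuffle(ranks)             # applicants arrive in random order
    cutoff = round(n / math.e)        # observe-only phase: reject these outright
    benchmark = max(ranks[:cutoff], default=-1)
    for rank in ranks[cutoff:]:
        if rank > benchmark:          # first applicant better than everyone seen so far
            return rank == n - 1      # did we land the very best one?
    return ranks[-1] == n - 1         # nobody beat the benchmark: stuck with the last applicant

n, trials = 100, 20_000
wins = sum(best_picked(n) for _ in range(trials))
print(f"hired the best applicant in {wins / trials:.1%} of trials (1/e ≈ 36.8%)")
```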

Curiously, the Secretary Problem has been known by a number of different names, most of which involve men picking between women in some way: the marriage problem, the sultan’s dowry problem, the fussy suitor problem, the googol game, and the best choice problem.

Further reading:

Bruss, 2000, Sum the odds to one and stop – https://doi.org/10.1214/aop/1019160340

The Secretary Problem – https://en.wikipedia.org/wiki/Secretary_problem

Authorship

Who wrote Beowulf?
Look for stylistic changes –
a single author?

Beowulf is one of the best-known examples of Old English literature, and debate has raged over whether the poem was written by a single author or combined from multiple sources. New research by Neidorf et al (2019) lends support to the single-author theory.

Beowulf survives in a single manuscript that has been dated to around AD 1000. Using a statistical approach called stylometry, the researchers analysed features of the writing, comparing the poem’s metre, word choices, letter combinations and sense pauses – small pauses between clauses and sentences. They found no evidence for any major stylistic shifts across the poem, suggesting that Beowulf is the work of a single author.
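As a toy illustration of the general idea (not the authors’ actual analysis), one crude stylometric signal is how closely the letter-pair frequencies of two halves of a text agree; a sharp drop at some split point would hint at a change of hand. The file name below is just a placeholder for any long text.

```python
# Toy stylometry: compare letter-pair (bigram) profiles of two halves of a text.
from collections import Counter

def bigram_profile(text: str) -> Counter:
    letters = [c for c in text.lower() if c.isalpha()]
    return Counter(a + b for a, b in zip(letters, letters[1:]))

def similarity(a: Counter, b: Counter) -> float:
    # Cosine similarity between the two bigram-frequency vectors.
    keys = set(a) | set(b)
    dot = sum(a[k] * b[k] for k in keys)
    norm = (sum(v * v for v in a.values()) * sum(v * v for v in b.values())) ** 0.5
    return dot / norm if norm else 0.0

text = open("beowulf.txt", encoding="utf-8").read()   # placeholder path: any long text works
half = len(text) // 2
first, second = bigram_profile(text[:half]), bigram_profile(text[half:])
print(f"bigram similarity between the two halves: {similarity(first, second):.3f}")
```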

Original research: http://dx.doi.org/10.1038/s41562-019-0570-1

God may be defined by John Norwood

God may be defined
P not equal to NP
Thus unprovable

This poem is about cryptography and makes reference to the famous unsolved P versus NP problem of computer science. Modern cryptographic techniques rely on problems that are hard to compute yet easy to verify: easy verification is what places them in NP (nondeterministic polynomial time), while no polynomial-time (P) method for solving them is known. If it were proven that no such problems exist – that is, that P equals NP – then such cryptography could be easily cracked. Our security is reliant upon an unprovable state, and the very nature of its unprovability is what makes it secure. This fits my personal definition of God as the unknowable, and I believe the power of faith is rooted in a healthy relationship with that which cannot be known.
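A small sketch of that asymmetry, using subset sum as a stand-in (a standard NP-complete problem; this is only an illustration of ‘easy to verify, hard to find’, not a claim about P versus NP itself):

```python
# Illustrative only: checking a proposed answer to a subset-sum instance is fast,
# while the naive search tries up to 2^n subsets.
from itertools import combinations

numbers = [267, 961, 1153, 1000, 1922, 493, 1598, 869, 1766, 1246]
target = 4588                      # built from a known subset, so a solution exists

def verify(subset, target):
    # Polynomial-time check: just add up the chosen numbers.
    return sum(subset) == target

def search(numbers, target):
    # Exponential-time brute force over every subset.
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if verify(subset, target):
                return subset
    return None

witness = search(numbers, target)
print("found:", witness, "| verifies:", verify(witness, target))
```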

The P versus NP problem was formally introduced by Stephen Cook in 1971 and independently by Leonid Levin, although the underlying ideas were considered earlier by John Nash, Kurt Gödel and John von Neumann. It is one of the seven Millennium Prize Problems identified by the Clay Mathematics Institute, with a reward of $1 million for the first correct solution.

John Norwood is a Mechanical Engineer working with Carbon, Inc. to revolutionize how things are made. His interests include old houses, yoga, baking, cryptography, and bluegrass music. You can follow him on Twitter under the handle @pryoga

Enjoyed this sciku? Check out some of John’s other work: Universal truth, The answer is none, With enough data, Rivers cut corners, and Squeamish ossifrage.

Algorithmic Complexity by Dr David Keyes

It’s not the flop count
But the data location;
The paradigm shifts.


Until recently, the analysis of algorithms emphasized finding the minimum number of operations required to complete a task to a given precision – the algorithmic complexity. This is natural when the operations are both the most time-consuming steps of the computation and a reasonable proxy for all other costs, including data motion.

Today, floating point operations (“flops”) are much cheaper in time and in energy than moving data into and out of the processing unit, while memory hierarchies based on multilevel caches deliver operands at latencies and energy costs that vary by several orders of magnitude, depending upon where the freshest copies of the data are found. This situation has led to the resurrection of old algorithms and the design of new ones that may do many more flops than previously “optimal” methods but move less data back and forth from remote locations, and thus finish in less time and with a smaller energy expenditure.
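A toy demonstration of the point (my own illustration, not drawn from the papers below): two NumPy reductions with an identical flop count, where the one forced to stream eight times as much memory through the cache hierarchy takes noticeably longer.

```python
# Same number of additions, very different amounts of data movement.
import numpy as np
import timeit

n = 1_000_000
contiguous = np.ones(n)              # n doubles packed back to back
strided = np.ones(8 * n)[::8]        # every 8th element of a buffer 8x larger

# Both reductions perform the same n - 1 additions...
t_contig = timeit.timeit(contiguous.sum, number=50)
t_strided = timeit.timeit(strided.sum, number=50)

# ...but the strided one drags roughly 8x more memory through the caches.
print(f"contiguous: {t_contig:.3f}s   strided: {t_strided:.3f}s")
```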

The author’s group has created several pieces of software that have displaced previous choices by optimizing memory transfers rather than flops. An example of a singular value decomposition that overcomes a flops handicap of as much as an order of magnitude is given in Sukkari et al (2016). For a community discussion of new paradigms on the path to exascale, see Dongarra et al (2011).

Original research:

Sukkari et al, 2016, https://doi.org/10.1145/2894747

Dongarra et al, 2011, https://doi.org/10.1177/1094342010391989

David Keyes directs the Extreme Computing Research Center at KAUST, where he was a founding dean in 2009. He inhabits the intersection of Mathematics, Computer Science, and applications, with a focus on colonizing emerging energy-efficient architectures for scientific computations. He is a Fellow of SIAM and AMS and has received the ACM Gordon Bell Prize and the IEEE Sidney Fernbach Award. As a lover of poetry, he is delighted to discover the Sciku community.

Enjoyed this sciku? Check out David’s other sciku: Low-rank representation.