# Conference Amplitudes 2015 – Integration Ahoy!

I recall fondly a maths lesson from my teenage years. Dr. Mike Wade – responsible as much as anyone for my scientific passion – was introducing elementary concepts of differentiation and integration. Differentiation is easy, he proclaimed. But integration is a tricky beast.

That prescient warning perhaps foreshadowed my entry into the field of amplitudes. For indeed integration is of fundamental importance in determining the outcome of scattering events. Computing precise “loop corrections” necessarily requires integration. And this is typically a hard task.

Today we were presented with a smorgasbord of integrals. Polylogarithms were the catch of the day. This broad class of functions covers pretty much everything you can get when computing amplitudes (provided your definition is generous)! So what are they? It fell to Dr. Erik Panzer to remind us.

Laymen will remember logarithms from school. These magic quantities turn multiplication into addition, giving rise to the ubiquitous schoolroom slide rules predating electronic calculators. Depending on your memory of math class, logarithms are either curious and fascinating or strange and terrifying! But boring they most certainly aren’t.

One of the most amusing properties of a logarithm comes about from (you guessed it) integration. Integrating $x^{a-1}$ is easy, you might recall. You’ll end up with $x^a/a$ plus some constant. But what happens when $a$ is zero? Then the formula makes no sense, because dividing by zero simply isn’t allowed.

And here’s where the logarithm comes to the rescue. As if by witchcraft it turns out that

$\displaystyle \int_0^x \frac{dt}{1-t} = -\log (1-x)$
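If you don't trust the witchcraft, a quick numerical check is reassuring. Here's a short sketch in Python (my own addition, not from the lecture): it approximates $\int_0^x dt/(1-t)$ with a simple midpoint rule and compares the result against $-\log(1-x)$.

```python
import math

def midpoint(f, a, b, n=100_000):
    """Approximate the integral of f over [a, b] by the midpoint rule."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

x = 0.5
lhs = midpoint(lambda t: 1.0 / (1.0 - t), 0.0, x)
rhs = -math.log(1.0 - x)
print(abs(lhs - rhs) < 1e-8)  # True: the two sides agree numerically
```

Nothing fancy – but it's a useful sanity check whenever an integral identity looks like magic.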

This kind of integral crops up when you compute scattering amplitudes. The traditional way to work out an amplitude is to draw Feynman diagrams – effectively pictures representing the answer. Every time you get a loop in the picture, you get an integration. Every time a particle propagates from A to B you get a fraction. Plug through the maths and you sometimes see integrals that give you logarithms!

But logarithms aren’t the end of the story. When you’ve got many loop integrations involved, and perhaps many propagators too, things can get messy. And this is where polylogarithms come in. They’ve got an integral form like logarithms, only instead of one integration there are many!

$\displaystyle \textrm{Li}_{\sigma_1,\dots, \sigma_n}(x) = \int_0^x \frac{dz_1}{z_1- \sigma_1}\int_0^{z_1} \frac{dz_2}{z_2-\sigma_2} \dots \int_0^{z_{n-1}}\frac{dz_n}{z_n-\sigma_n}$

It’s easy to check that our beloved $\log$ function emerges (up to a sign) from setting $n=1$ and $\sigma_1=1$. There’s some interesting sociology underlying polylogs. The polylogs I’ve defined are variously known as hyperlogs, generalized polylogs and Goncharov polylogs depending on who you ask. This confusion stems from the fact that these functions have been studied in several fields besides amplitudes, and predictably nobody can agree on a name! One name that is universally accepted is classical polylogs – the simpler functions which emerge when you set all the $\sigma$s to zero except the innermost one, $\sigma_n = 1$.
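To make this concrete, take $n=2$ with $\sigma_1=0$ and $\sigma_2=1$. The inner integral evaluates to $\log(1-z_1)$, and the outer integration then produces $-\mathrm{Li}_2(x)$, the classical dilogarithm. Here's a quick numerical check (my own Python sketch, not part of the talk) comparing the iterated integral against the dilogarithm's power series:

```python
import math

def midpoint(f, a, b, n=20_000):
    """Approximate the integral of f over [a, b] by the midpoint rule."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

x = 0.5
# sigma = (0, 1): the inner integral is log(1 - z1), so the iterated
# integral collapses to a single one, equal to -Li_2(x).
iterated = midpoint(lambda z1: math.log(1.0 - z1) / z1, 0.0, x)
# Classical dilogarithm from its power series Li_2(x) = sum_k x^k / k^2.
li2 = sum(x**k / k**2 for k in range(1, 200))
print(abs(iterated + li2) < 1e-6)  # True
```

The same trick of peeling off the innermost integration works at any weight, which is exactly why these iterated integrals are so amenable to computer algebra.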

So far we’ve just given names to some integrals we might find in amplitudes. But this is only the beginning. It turns out there are numerous interesting relations between different polylogs, which can be encoded by clever mathematical tools going by esoteric names – cluster algebras, motives and the symbol to name but a few. Erik warmed us up on some of these topics, while also mentioning that even generalized polylogs aren’t the whole story! Sometimes you need even wackier functions like elliptic polylogs.

All this gets rather technical quite quickly. In fact, complicated functions and swathes of algebra are a sad corollary of the traditional Feynman diagram approach to amplitudes. But thankfully there are new and powerful methods on the market. We heard about these so-called bootstraps from Dr. James Drummond and Dr. Matt von Hippel.

The term bootstrap is an old one, emerging in the 1960s to describe methods which use symmetry, locality and unitarity to determine amplitudes. It’s probably a humorous reference to the old English saying “pull yourself up by your bootstraps” to emphasise the achievement of lofty goals from meagre beginnings. Research efforts in the 60s had limited success, but the modern bootstrap programme is going from strength to strength. This is due in part to our much improved understanding of polylogarithms and their underlying mathematical structure.

The philosophy goes something like this. Assume that your answer can be written as a polylog (more precisely as a sum of polylogs, with the integrand expressed as $\prod_i d \log(R_i)$ for appropriate rational functions $R_i$). Now write down all the possible rational functions that could appear, based on your knowledge of the process. Treat these as alphabet bricks. Now put your alphabet bricks together in every way that seems sensible.

The reason the method works is that there’s only one way to make a meaningful “word” out of your alphabet bricks. Locality forces the first letter to be a kinematic invariant, or else your answer would have branch cuts which don’t correspond to physical particles. Take it from me, that isn’t allowed! Supersymmetry cuts down the possibilities for the final letter. A cluster algebra ansatz also helps keep the possibilities down, though a physical interpretation for this is as yet unknown. For $7$ particles this is more-or-less enough to get you the final answer. But weirdly $6$ particles is more complicated! Counter-intuitive, but hey – that’s research. To fix the six point result you must appeal to impressive all-loop results from integrability.

Next up for these bootstrap folk is higher loops. According to Matt, the $5$-loop result should be gettable. But beyond that the sheer number of functions involved might mean the method crashes. Naively one might expect that the problem lies with having insufficiently many constraints. But apparently the real issue is more prosaic – we just don’t have the computing power to whittle down the options beyond 5-loop.

With the afternoon came a return to Feynman diagrams, but with a twist. Professor Johannes Henn talked us through an ingenious evaluation method based on differential equations. The basic concept has been known for a long time, but relies heavily on choosing the correct basis of integrals for the diagram under consideration. Johannes’ great insight was to use conjectures about the dlog form of integrands to suggest a particularly nice set of basis integrals. This makes solving the differential equations a cinch – a significant achievement!

Now the big question is – when can this new method be applied? As far as I’m aware there’s no proof that this nice integral basis always exists. But it seems that it’s there for enough cases to be useful! The day closed with some experimentally relevant applications, the acid test. I’m now curious as to whether you can link the developments in symbology and cluster algebras with this differential equation technique to provide a mega-powerful amplitude machine…! And that’s where I ought to head to bed, before you readers start to worry about theoretical physicists taking over the world.

## Conversations

It was a pleasure to chat all things form factors with Brenda Penante, Mattias Wilhelm and Dhritiman Nandan at lunchtime. Look out for an “on-shell” blog post soon.

I must also thank Lorenzo Magnea for an enlightening discussion on soft theorems. Time to bury my head in some old papers I’d previously overlooked!

# Integrating Differentials in Thermodynamics

I’ve just realised I made a mistake when teaching my statistical physics course last term. Fortunately it was a minor and careless maths mistake, rather than any lack of physics clarity. But it’s time to set the record straight!

Often in thermodynamics you will derive equations in differential form. For example, you might be given some equations of state and asked to derive the entropy of a system using the first law

$\displaystyle dE = TdS - pdV$

My error pertained to exactly such a situation. My students had derived the equation

$\displaystyle dS = (V/E)^{1/2}dE+(E/V)^{1/2}dV$

and were asked to integrate this up to find $S$. Naively you simply integrate each term separately and add the answers. But of course this is wrong! Or more precisely this is only correct if you get the limits of integration exactly right.

I’ll return to my cryptic comment about limits of integration later; for now let me recap the correct way to go about the problem. There are four steps.

1. Rewrite it as a system of partial DEs

This is easy – we just have

$\displaystyle \partial S/\partial E = (V/E)^{1/2} \textrm{ and } \partial S / \partial V = (E/V)^{1/2}$

2. Integrate w.r.t. E adding an integration function $g(V)$

Again we do what it says on the tin, namely

$\displaystyle S(E,V) = 2 (EV)^{1/2} + g(V)$

3. Substitute in the $\partial S/\partial V$ equation to derive an ODE for $g$

We get $dg/dV = 0$ in this case, easy as.

4. Solve this ODE and write down the full answer

Immediately we know that $g$ is just a constant function, so we can write

$\displaystyle S(E,V) = 2 (EV)^{1/2} + \textrm{const}$

Contrast this with the wrong answer from naively integrating up and adding each term. This would have produced $4(EV)^{1/2}$, a factor of $2$ out!
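The four steps above are easy to automate with a computer algebra system. Here's a minimal sketch using sympy (my own addition – the course didn't use any computer algebra), which also verifies along the way that $dS$ really is an exact differential:

```python
import sympy as sp

E, V = sp.symbols("E V", positive=True)
dS_dE = sp.sqrt(V / E)   # partial S / partial E
dS_dV = sp.sqrt(E / V)   # partial S / partial V

# Sanity check that dS is exact: the mixed second partials must agree.
assert sp.simplify(sp.diff(dS_dE, V) - sp.diff(dS_dV, E)) == 0

# Step 2: integrate w.r.t. E; the integration "constant" is a function g(V).
S_partial = sp.integrate(dS_dE, E)          # 2*sqrt(E*V), up to g(V)
# Step 3: substitute into the dS/dV equation to get an ODE for g.
g_prime = sp.simplify(dS_dV - sp.diff(S_partial, V))
# Step 4: g' = 0, so g is constant and S = 2*sqrt(E*V) + const.
print(sp.simplify(S_partial - 2 * sp.sqrt(E * V)), g_prime)  # 0 0
```

Note how the naive double-counting never arises here: the $g(V)$ bookkeeping in step 2 is precisely what stops you adding the $2(EV)^{1/2}$ piece twice.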

So what of my mysterious comment about limits above? Well, because $dS$ is an exact differential, we know that we can integrate it over any path and will get the same answer. This path independence is an important reason that the entropy is a genuine physical quantity, whereas there’s no absolute notion of heat. In particular we can find $S$ by integrating along the $E'$ axis to $E' = E$, then in the $V'$ direction from $(E',V')=(E,0)$ to $(E',V')=(E,V)$.

Mathematically this looks like

$\displaystyle S(E,V) = \int_{(0,0)}^{(E,V)} dS = \int_{(0,0)}^{(E,0)}(V'/E')^{1/2}dE' + \int_{(E,0)}^{(E,V)}(E'/V')^{1/2}dV'$

The first integral now gives $0$ since $V'=0$ along the $E'$ axis. The second term gives the correct answer $S(E,V) = 2(EV)^{1/2}$ as required.
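Path independence is also easy to test numerically. Here's a sketch (my own, with made-up values $E=2$, $V=3$) that integrates $dS$ both along the right-angled path described above and along a parabolic path, and checks that both reproduce $2(EV)^{1/2}$:

```python
import math

def midpoint(f, a, b, n=20_000):
    """Approximate the integral of f over [a, b] by the midpoint rule."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

E, V = 2.0, 3.0
exact = 2.0 * math.sqrt(E * V)

# Path 1: along the E' axis the integrand vanishes (V' = 0), then straight
# up in V' at fixed E' = E.  The 1/sqrt(V') behaviour at the corner makes
# the plain midpoint rule converge slowly, hence the loose tolerance.
path1 = midpoint(lambda v: math.sqrt(E / v), 0.0, V)

# Path 2: a parabola V' = V * (E'/E)**2, parametrised by E'.
def integrand(e):
    v = V * (e / E) ** 2
    dv_de = 2.0 * V * e / E**2
    return math.sqrt(v / e) + math.sqrt(e / v) * dv_de

path2 = midpoint(integrand, 0.0, E)
print(abs(path1 - exact) < 0.05, abs(path2 - exact) < 1e-5)  # True True
```

Had $dS$ not been exact, these two answers would generally differ – which is exactly the situation for heat, where the path through state space matters.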

In case you want a third way to solve this problem correctly, check out this webpage which provides another means of integrating differentials correctly!

So there you have it – your health warning for today. Take care when integrating your differentials!