Particle Physics Planet
February 21, 2017
I’ve had a busy morning teaching and a busy afternoon meeting some interesting people from IBM and elsewhere in connection with Data Innovation Institute business, so just time to mention that I’m looking forward tonight to an event at Cardiff Metropolitan University (whose campus is not far from my house) featuring renowned rugby referee Nigel Owens who, in case you hadn’t realized, is gay. The event is part of the celebrations in Cardiff of LGBT History Month.
I’ll update later with reflections on the evening, but in the meantime here are some examples of him in action on the rugby field!
The Mystery Machine for particles hits the road.
It’s not as flashy as Scooby Doo’s Mystery Machine, but scientists at Virginia Tech hope that their new vehicle will help solve mysteries about a ghost-like phenomenon: neutrinos.
The Mobile Neutrino Lab is a trailer built to contain and transport a 176-pound neutrino detector named MiniCHANDLER (Carbon Hydrogen AntiNeutrino Detector with a Lithium Enhanced Raghavan-optical-lattice). When it begins operations in mid-April, MiniCHANDLER will make history as the first mobile neutrino detector in the US.
“Our main purpose is just to see neutrinos and measure the signal to noise ratio,” says Jon Link, a member of the experiment and a professor of physics at Virginia Tech’s Center for Neutrino Physics. “We just want to prove the detector works.”
Neutrinos are fundamental particles with no electric charge, a property that makes them difficult to detect. These elusive particles have confounded scientists on several fronts for more than 60 years. MiniCHANDLER is specifically designed to detect neutrinos' antimatter counterparts, antineutrinos, produced in nuclear reactors, which are prolific sources of the tiny particles.
Fission at the core of a nuclear reactor splits uranium atoms, and the neutron-rich fragments themselves undergo beta decay, a process that emits an electron and an electron antineutrino. Other, larger detectors such as Daya Bay have capitalized on this abundance to measure neutrino properties.
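Concretely, each neutron-rich fission fragment beta-decays: a neutron inside the nucleus converts to a proton, emitting the electron and antineutrino mentioned above:

```latex
% Beta decay of a neutron inside a fission fragment:
n \;\longrightarrow\; p + e^{-} + \bar{\nu}_{e}
```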
MiniCHANDLER will serve as a prototype for future mobile neutrino experiments up to 1 ton in size.
Link and his colleagues hope MiniCHANDLER and its future counterparts will find answers to questions about sterile neutrinos, an undiscovered, theoretical kind of neutrino and a candidate for dark matter. The detector could also have applications for national security by serving as a way to keep tabs on material inside of nuclear reactors.
MiniCHANDLER echoes a similar mobile detector concept from a few years ago. In 2014, a Japanese team published results from another mobile neutrino detector, but their data did not meet the threshold for statistical significance. Detector operations were halted after all reactors in Japan were shut down for safety inspections.
“We can monitor the status from outside of the reactor buildings thanks to [a] neutrino’s strong penetration power,” Shugo Oguri, a scientist who worked on the Japanese team, wrote in an email.
Link and his colleagues believe their design is an improvement, and the hope is that MiniCHANDLER will be able to better reject background events and successfully detect neutrinos.
Neutrinos, where are you?
To detect neutrinos, which are abundant but interact very rarely with matter, physicists typically use huge structures such as Super-Kamiokande, a neutrino detector in Japan that contains 50,000 tons of ultra-pure water. Experiments are also often placed far underground to block out signals from other particles that are prevalent on Earth’s surface.
With its small size and aboveground location, MiniCHANDLER subverts both of these norms.
The detector uses solid scintillator technology, which will allow it to record about 100 antineutrino interactions per day. This interaction rate is less than the rate at large detectors, but MiniCHANDLER makes up for this with its precise tracking of antineutrinos.
Small plastic cubes pinpoint where in MiniCHANDLER an antineutrino interacts by detecting light from the interaction. However, the same kind of light signal can also come from other passing particles like cosmic rays. To distinguish between the antineutrino and the riffraff, Link and his colleagues look for multiple signals to confirm the presence of an antineutrino.
Those signs come from a process called inverse beta decay, which occurs when an antineutrino collides with a proton, converting the proton into a neutron and emitting a positron. The positron produces a prompt flash of light (the first event), while the neutron travels more slowly and is picked up as a secondary signal, confirming the antineutrino interaction.
“[MiniCHANDLER] is going to sit on the surface; it's not shielded well at all. So it's going to have a lot of background,” Link says. “Inverse beta decay gives you a way of rejecting the background by identifying the two-part event.”
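The two-part signature lends itself to a simple delayed-coincidence cut. Here is a toy sketch of that idea (the event format and the time window are invented for illustration; this is not the experiment's actual analysis code):

```python
# Toy delayed-coincidence filter: keep prompt flashes that are
# followed by a second (neutron-like) signal within a time window.
# Events are (time_in_microseconds, kind) tuples, sorted by time;
# the window length below is hypothetical.

WINDOW_US = 200.0  # hypothetical coincidence window

def find_candidates(events):
    """Return times of prompt signals followed by a delayed
    neutron-like signal within WINDOW_US microseconds."""
    candidates = []
    for i, (t, kind) in enumerate(events):
        if kind != "prompt":
            continue
        for t2, kind2 in events[i + 1:]:
            if t2 - t > WINDOW_US:
                break  # outside the window: stop scanning
            if kind2 == "neutron":
                candidates.append(t)
                break
    return candidates

events = [
    (10.0, "prompt"),   # lone cosmic-ray flash: rejected
    (500.0, "prompt"),  # followed by a neutron capture: kept
    (620.0, "neutron"),
]
print(find_candidates(events))  # [500.0]
```

Real detectors add energy cuts and position matching between the two signals, but the timing coincidence alone already removes most single-flash backgrounds.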
Monitoring the reactors
Scientists could find use for a mobile neutrino detector beyond studying reactor neutrinos. They could also use the detector to measure properties of the nuclear reactor itself.
A mobile neutrino detector could be used to determine whether a reactor is in use, Oguri says. “Detection unambiguously means the reactors are in operation—nobody can cheat the status.”
The detector could also be used to determine whether material from a reactor has been repurposed to produce nuclear weapons. Plutonium, an element used in the process of making weapons-grade nuclear material, produces 60 percent fewer detectable neutrinos than uranium, the primary component in a reactor core.
“We could potentially tell whether or not the reactor core has the right amount of plutonium in it,” Link says.
Using a neutrino detector would be a non-invasive way to track the material; other methods of testing nuclear reactors can be time-consuming and disruptive to the reactor’s processes.
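As a back-of-the-envelope illustration of why this works, here is a toy calculation (the fuel fractions below are invented; only the "60 percent fewer" figure comes from the article):

```python
# Toy model: relative antineutrino detection rate of a reactor core,
# using the article's figure that plutonium fissions yield about 60%
# fewer detectable antineutrinos than uranium fissions.
# The fuel fractions passed in are hypothetical.

U_YIELD = 1.0    # uranium yield, normalised to 1
PU_YIELD = 0.4   # 60% fewer detectable antineutrinos

def relative_rate(uranium_fraction, plutonium_fraction):
    """Detected antineutrino rate relative to a pure-uranium core."""
    return uranium_fraction * U_YIELD + plutonium_fraction * PU_YIELD

fresh = relative_rate(1.0, 0.0)    # fresh, all-uranium core
burned = relative_rate(0.7, 0.3)   # some uranium replaced by plutonium
print(fresh, round(burned, 2))     # 1.0 0.82
```

A measurable drop in the antineutrino rate, with the reactor still running, is thus a hint that the core composition has shifted away from what was declared.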
But for now, Link just wants MiniCHANDLER to achieve a simple—yet groundbreaking—goal: Get the mobile neutrino lab running.
The ABC workshop I co-organised has now started and, despite a few last-minute cancellations, we have gathered a great crowd of researchers on the validation and expansion of ABC methods. Or ABC’ory, to keep up with my naming of workshops. The videos of the talks should come up progressively on the BIRS webpage, at least when I did not forget to launch the recording. The program is quite open and, with a workshop of this size, allows talks and discussions to last longer than planned: the first days contain several expository talks on ABC convergence, auxiliary or synthetic models, summary constructions, challenging applications, dynamic models, and model assessment, plus prepared discussions on those topics that hopefully involve several workshop participants. We had also set aside some time for snap-talks, to induce everyone to give a quick presentation of their on-going research and open problems. The first day was rather full but saw a lot of interaction and discussion during and around the talks, a mood I hope will last till Friday! Today, in replacement of Richard Everitt, who alas got sick just before the workshop, we are conducting a discussion on dimensional issues, partly based on the following slides (mostly recycled from earlier talks, including the mini-course in Les Diablerets):
Filed under: Mountains, pictures, Statistics, Travel, University life Tagged: 17w5025, ABC, Approximate Bayesian computation, Banff, BIRS, Canada, convergence, Les Diablerets, Rocky Mountains, synthetic likelihood
February 20, 2017
A new theoretical paper in PRL has extended the Standard Model of elementary particles to include new particles, and tries to mash different ideas and theories into this new standard model called SMASH - Standard Model Axion See-saw Higgs portal inflation (yeah, it's a mouthful).
SMASH adds six new particles to the seventeen fundamental particles of the standard model. The particles are three heavy right-handed neutrinos, a color triplet fermion, a particle called rho that both gives mass to the right-handed neutrinos and drives cosmic inflation together with the Higgs boson, and an axion, which is a promising dark matter candidate. With these six particles, SMASH does five things: produces the matter–antimatter imbalance in the Universe; creates the mysterious tiny masses of the known left-handed neutrinos; explains an unusual symmetry of the strong interaction that binds quarks in nuclei; accounts for the origin of dark matter; and explains inflation.
Of course, as with ANY theoretical idea, which often has a long gestation period, a lot of patient waiting and testing will have to be done to verify many of its predictions. But this seems to have created quite an excitement about revamping the Standard Model.
As I discovered (!) the Annals of Applied Statistics in my mailbox just prior to taking the local train to Dauphine for the first time in 2017 (!), I started reading it on the way, but did not get any further than the first discussion paper by Pengsheng Ji and Jiashun Jin on coauthorship and citation networks for statisticians. I found the whole exercise intriguing, I must confess, with little to support a whole discussion on the topic. I may have read the paper too superficially as a métro pastime, but to me it sounded more like a post-hoc analysis than a statistical exercise, something like looking at the network, or rather at the output of a software representing networks, and making sense of clumps and sub-networks a posteriori. (In a way this reminded me of my first SAS project at school, on the patterns of vacations in France. It was in 1983, on punched cards. And we spent a while cutting & pasting, in a literal sense, the 80-column graphs produced by SAS on endless listings.)
It may be that part of the interest in the paper is self-centred. I do not think analysing a similar dataset in another field like deconstructionist philosophy or Korean raku would have attracted the same attention. Looking at the clusters and the names on the pictures obviously makes sense, if more at a curiosity than a scientific level, as I do not think this brings much in terms of ranking and evaluating research (despite what Bernard Silverman suggests in his preface) or understanding collaborations (beyond the fact that people in the same subfield or same active place like Duke tend to collaborate). Speaking of curiosity, I was quite surprised to spot my name in one network and even more to see that I was part of the “High-Dimensional Data Analysis” cluster, rather than of the “Bayes” cluster. I cannot fathom how I ended up in that theme, as I cannot think of a single paper of mine pertaining to either high dimensions or data analysis [to force the trait just a wee bit!]. Maybe thanks to my joint paper with Peter Mueller. (I tried to check the data itself but cannot trace my own papers in the raw datafiles.)
I also wonder what the point is of looking at only four major journals in the field, missing for instance most of computational statistics and biostatistics, not to mention machine learning or econometrics. This results in a somewhat narrow niche, if obviously recovering the main authors in the [corresponding] field. Some major players in computational stats still make it to the lists, like Gareth Roberts or Håvard Rue, but under the wrong categorisation of spatial statistics.
Filed under: Books, pictures, R, Statistics, University life Tagged: Annals of Applied Statistics, Annals of Statistics, Biometrika, citation map, coauthors, JASA, Journal of the Royal Statistical Society, JRSSB, network, Series B
Today there has been a day of action under the banner of “One Day Without Us” to celebrate the contribution that migrants make to the United Kingdom and to counter the growing tide of racism and xenophobia associated with some elements of the recent campaign for this country to leave the European Union. Here’s a video produced by the campaign.
I wish to make it clear, as someone who was born in the United Kingdom, that I am appalled by the present government’s refusal to guarantee the rights of the millions of EU citizens who have made this country their home and enriched us all with their presence here. Migrants have had a positive effect across all sectors of the UK economy for a very long time, but I wish to highlight from my own experience the enormous contribution “migrants” – or as I prefer to call them “colleagues” – make to our Universities. Non-UK scientists form the backbone of the School of Physics & Astronomy here at Cardiff, just as they did in the School of Mathematical and Physical Sciences at the University of Sussex. Without them I don’t know how we’d carry on.
Now that the Article 50 Bill has begun its progress through the House of Lords, I hope that it can be amended to force the government to stop treating such valuable people in such a despicable way. In the meantime all I can do – and I know it’s only a gesture – is say that the government does not speak for me, or for any of my colleagues, and that I hope and believe that it will be made to abandon its repellent notion that people can be treated like bargaining chips.
The mower stalled, twice; kneeling, I found
A hedgehog jammed up against the blades,
Killed. It had been in the long grass.
I had seen it before, and even fed it, once.
Now I had mauled its unobtrusive world
Unmendably. Burial was no help:
Next morning I got up and it did not.
The first day after a death, the new absence
Is always the same; we should be careful
Of each other, we should be kind
While there is still time.
by Philip Larkin (1922-1985)
February 19, 2017
A news report last weekend on the Nature webpage about the new science super-campus south of Paris connected with my impressions of the whole endeavour: the annual report from the Court of Auditors estimated that the 5 billion euros invested in this construct were not exactly a clever use of public [French taxpayer] money! The notion of bringing a large number of [State] engineering and scientific schools from downtown Paris to the plateau of Saclay, about 25km south-west of Paris, around École Polytechnique, had some appeal, since these were and are prestigious institutions, most with highly selective entry exams and with similar training programs, now that they have almost completely lost the specialisation that justified their parallel existences! And since a genuine university, Paris 11 Orsay, stood nearby at the bottom of the plateau. Plus a host of startups and research branches of companies. Hence the concept of a French MIT.
However, as is so often the case in Jacobin France, the move was decided and supported by the State “top-down” rather than by the original institutions themselves, including a big push by Nicolas Sarkozy in 2010. While the campus can be reached by public transportation like the RER, living and working on the campus is obviously less appealing to both students and staff than a listed building in the centre of Paris, especially when lodging and living infrastructures are yet to be completed. But the main issue is that the fragmentation of those schools, labs and institutes, in terms of leadership, recruiting, teaching, and research, has not been solved by the move, each entity remaining strongly attached to its identity, degrees, networks, &tc., and definitely unwilling to merge into a super-university with a more efficient organisation of teaching and research. This means the overall structure as such is close to invisible at the international level, which is the point raised by the State auditors. And perceived by the State, which threatens to cut funding at this late stage!
This is not the only example among French higher education institutions, since most have been forced to merge into incomprehensible super-units under the same financial threat. Like Paris-Dauphine being now part of the heterogeneous PSL (Paris Sciences et Lettres) conglomerate. (I suspect one of the primary reasons for this push by central authorities was to create larger entities so as to move up in the international university rankings, which is absurd for many reasons: the limited worth of such rankings, the lag between the creation of a new entity and its appearance in an international university ranking, and the difficulty of attributing researchers to such institutions: in Paris-Dauphine, the address to put on papers is more than a line long, with half a dozen acronyms!)
Filed under: Kids, pictures, University life Tagged: École Polytechnique, ENSAE, French politics, French universities, Orsay, Paris-Saclay campus, PSL, Université Paris Dauphine
February 18, 2017
The Azimuth Climate Data Backup Project is going well! Our Kickstarter campaign ended on January 31st and the money has recently reached us. Our original goal was $5000. We got $20,427 of donations, and after Kickstarter took its cut we received $18,590.96.
Next time I’ll tell you what our project has actually been doing. This time I just want to give a huge “thank you!” to all 627 people who contributed money on Kickstarter!
I sent out thank you notes to everyone, updating them on our progress and asking if they wanted their names listed. The blanks in the following list represent people who either didn’t reply, didn’t want their names listed, or backed out and decided not to give money. I’ll list people in chronological order: first contributors first.
Only 12 people backed out; the vast majority of blanks on this list are people who haven’t replied to my email. I noticed some interesting but obvious patterns. For example, people who contributed later are less likely to have answered my email yet—I’ll update this list later. People who contributed more money were more likely to answer my email.
The magnitude of contributions ranged from $2000 to $1. A few people offered to help in other ways. The response was international—this was really heartwarming! People from the US were more likely than others to ask not to be listed.
But instead of continuing to list statistical patterns, let me just thank everyone who contributed.
Daniel Estrada Ahmed Amer Saeed Masroor Jodi Kaplan John Wehrle Bob Calder Andrea Borgia L Gardner Uche Eke Keith Warner Dean Kalahan James Benson Dianne Hackborn Walter Hahn Thomas Savarino Noah Friedman Eric Willisson Jeffrey Gilmore John Bennett Glenn McDavid Brian Turner Peter Bagaric Martin Dahl Nielsen Broc Stenman Gabriel Scherer Roice Nelson Felipe Pait Kenneth Hertz Luis Bruno Andrew Lottmann Alex Morse Mads Bach Villadsen Noam Zeilberger Buffy Lyon Josh Wilcox Danny Borg Krishna Bhogaonker Harald Tveit Alvestrand Tarek A. Hijaz, MD Jouni Pohjola Chavdar Petkov Markus Jöbstl Bjørn Borud Sarah G William Straub Frank Harper Carsten Führmann Rick Angel Drew Armstrong Jesimpson Valeria de Paiva Ron Prater David Tanzer Rafael Laguna Miguel Esteves dos Santos Sophie Dennison-Gibby Randy Drexler Peter Haggstrom Jerzy Michał Pawlak Santini Basra Jenny Meyer John Iskra Bruce Jones Māris Ozols Everett Rubel Mike D Manik Uppal Todd Trimble Federer Fanatic Forrest Samuel, Harmos Consulting Annie Wynn Norman and Marcia Dresner Daniel Mattingly James W. Crosby Jennifer Booth Greg Randolph Dave and Karen Deeter Sarah Truebe Jeffrey Salfen Birian Abelson Logan McDonald Brian Truebe Jon Leland Sarah Lim James Turnbull John Huerta Katie Mandel Bruce Bethany Summer Anna Gladstone Naom Hart Aaron Riley Giampiero Campa Julie A. Sylvia Pace Willisson Bangskij Peter Herschberg Alaistair Farrugia Conor Hennessy Stephanie Mohr Torinthiel Lincoln Muri Anet Ferwerda Hanna Michelle Lee Guiney Ben Doherty Trace Hagemann Ryan Mannion Penni and Terry O'Hearn Brian Bassham Caitlin Murphy John Verran Susan Alexander Hawson Fabrizio Mafessoni Anita Phagan Nicolas Acuña Niklas Brunberg Adam Luptak V. 
Lazaro Zamora Branford Werner Niklas Starck Westerberg Luca Zenti and Marta Veneziano Ilja Preuß Christopher Flint George Read Courtney Leigh Katharina Spoerri Daniel Risse Hanna Charles-Etienne Jamme rhackman41 Jeff Leggett RKBookman Aaron Paul Mike Metzler Patrick Leiser Melinda Ryan Vaughn Kent Crispin Michael Teague Ben Fabian Bach Steven Canning Betsy McCall John Rees Mary Peters Shane Claridge Thomas Negovan Tom Grace Justin Jones Jason Mitchell Josh Weber Rebecca Lynne Hanginger Kirby Dawn Conniff Michael T. Astolfi Kristeva Erik Keith Uber Elaine Mazerolle Matthieu Walraet Linda Penfold Lujia Liu Keith Samar Tareem Henrik Almén Michael Deakin Erin Bassett James Crook Junior Eluhu Dan Laufer Carl Robert Solovay Silica Magazine Leonard Saers Alfredo Arroyo García Larry Yu John Behemonth Eric Humphrey Øystein Risan Borgersen David Anderson Bell III Ole-Morten Duesend Adam North and Gabrielle Falquero Robert Biegler Qu Wenhao Steffen Dittmar Shanna Germain Adam Blinkinsop John WS Marvin (Dread Unicorn Games) Bill Carter Darth Chronis Lawrence Stewart Gareth Hodges Colin Backhurst Christopher Metzger Rachel Gumper Mariah Thompson Falk Alexander Glade Johnathan Salter Maggie Unkefer Shawna Maryanovich Wilhelm Fitzpatrick Dylan “ExoByte” Mayo Lynda Lee Scott Carpenter Charles D, Payet Vince Rostkowski Tim Brown Raven Daegmorgan Zak Brueckner Christian Page Adi Shavit Steven Greenberg Chuck Lunney Adriel Bustamente Natasha Anicich Bram De Bie Edward L Gray Detrick Robert Sarah Russell Sam Leavin Abilash Pulicken Isabel Olondriz James Pierce James Morrison April Daniels José Tremblay Champagne Chris Edmonds Hans & Maria Cummings Bart Gasiewiski Andy Chamard Andrew Jackson Christopher Wright ichimonji10 Alan Stern Alison W Dag Henrik Bråtane Martin Nilsson William Schrade
After a misty morning it has turned into a lovely Spring-like afternoon here in Cardiff. I’ve been in the School of Physics & Astronomy at Cardiff University this morning, helping out with a UCAS visit day by interviewing some prospective students and then having lunch and chatting with parents and others.
As well as the weather and the admissions season, another indication of the passage of the seasons is the Six Nations rugby. Our Saturday UCAS visit days have to be arranged with the RBS Six Nations fixture list in mind because Cardiff gets incredibly busy when Wales are playing at home. The capacity of the Principality Stadium (formerly the Millennium Stadium) is well over 80,000 which, for a city with a population of just over 300,000, represents a huge perturbation.
Not only is there a lot of traffic and a very crowded city centre, but it’s also very difficult to find hotel accommodation at a reasonable price on match weekends. Given that we start in the morning, quite a few prospective students and their families do stay overnight beforehand so this is quite an important consideration. There are no fixtures in the RBS Six Nations this weekend. Today two of my interviewees had travelled quite a long way to get to Cardiff – one from Richmond in North Yorkshire and another from Falmouth in Cornwall – and both families stayed over last night.
Anyway, while I’m not talking about the Six Nations I can’t resist mentioning last week’s match here in Cardiff between Wales and England. I didn’t have a ticket. I’ve never really figured out how to get tickets for these matches; they always seem to be completely sold out as soon as they go on sale.
Before the match, I thought it was going to be a close game but Wales always have tremendous home advantage at Cardiff and I thought they might just sneak it. It was a rather dour struggle to be honest, but with less than ten minutes to go Wales were leading 16-14 and my suspicions seemed about to be confirmed. However, as is often the case with close matches, it was an error that produced the decisive moment.
About five metres out, Wales turned possession over and then rucked successfully, the ball eventually going to Jonathan Davies behind his own try line. With half of his team trying to disentangle themselves from the completed ruck, it was essential for him to clear his lines by kicking into touch. Unfortunately, he kicked straight down the field where his kick was collected by George Ford. England’s counter-attack was swift and lethal: Ford to Farrell and then to Elliott Daly on the wing, who went over for the try to the sound of groans all round Cardiff. After the conversion it was Wales 16 England 21, which is how the game ended a few minutes later.
The results of the other games so far mean that the only team capable of winning a grand slam is England, as each of the other teams has lost at least one game. There’s still a long way to go, however, and England still face challenging matches against Ireland and a much-improved Scotland.
Anyway, all this UCAS malarkey means that I’m way behind on Saturday crossword duties, so I’m going home. Toodle-pip.
Guest post by Liang Ze Wong
The Kan Extension Seminar II continues and this week, we discuss Jon Beck’s “Distributive Laws”, which was published in 1969 in the proceedings of the Seminar on Triples and Categorical Homology Theory, LNM vol 80. In the previous Kan seminar post, Evangelia described the relationship between Lawvere theories and finitary monads, along with two ways of combining them (the sum and tensor) that are very natural for Lawvere theories but less so for monads. Distributive laws give us a way of composing monads to get another monad, and are more natural from the monad point of view.
Beck’s paper starts by defining and characterizing distributive laws. He then describes the category of algebras of the composite monad. Just as monads can be factored into adjunctions, he next shows how distributive laws between monads can be “factored” into a “distributive square” of adjunctions. Finally, he ends off with a series of examples.
Before we dive into the paper, I would like to thank Emily Riehl, Alexander Campbell and Brendan Fong for allowing me to be a part of this seminar, and the other participants for their wonderful virtual company. I would also like to thank my advisor James Zhang and his group for their insightful and encouraging comments as I was preparing for this seminar.
First, some differences between this post and Beck’s paper:
I’ll use the standard, modern convention for composition: the composite of f : X → Y followed by g : Y → Z will be denoted gf. This would be written fg in Beck’s paper.
I’ll use the terms “monad” and “monadic” instead of “triple” and “tripleable”.
I’ll rely quite a bit on string diagrams instead of commutative diagrams. These are to be read from right to left and top to bottom. You can learn about string diagrams through these videos or this paper (warning: they read string diagrams in different directions than this post!).
All constructions involving the category of T-algebras, C^T, will be done in an “object-free” manner involving only the universal property of C^T.
The last two points have the advantage of making the resulting theory applicable to 2-categories or bicategories other than Cat, by replacing categories/ functors/ natural transformations with 0/1/2-cells.
Since string diagrams play a key role in this post, here’s a short example illustrating their use. Suppose we have functors F : C → D and U : D → C such that F ⊣ U. Let η : 1 ⇒ UF be the unit and ε : FU ⇒ 1 the counit of the adjunction. Then the composite UεF : UFUF ⇒ UF can be drawn thus:
Most diagrams in this post will not be as meticulously labelled as the above. Unlabelled white regions will always stand for a fixed category C. If F ⊣ U, I’ll use the same colored string to denote them both, since they can be distinguished from their context: above, F goes from a white to a red region, whereas U goes from red to white (remember to read from right to left!). The composite monad UF (not shown above) would also be a string of the same color, going from a white region to a white region.
Example 1: Let T be the free monoid monad and S be the free abelian group monad over Set. Then the elementary school fact that multiplication distributes over addition means we have a function ℓ_X : TSX → STX for X a set, sending (a+b)(c+d), say, to ac+ad+bc+bd. Further, the composite of S with T, namely ST, is the free ring monad.
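The distribution step in Example 1 can be sketched concretely in code. Below, a “product of sums” like (a+b)(c+d) is encoded as a list of lists of terms (an illustrative encoding chosen here, not anything from Beck’s paper), and distributing expands it into a “sum of products”:

```python
from itertools import product

# A product of sums like (a+b)(c+d) is encoded as a list of
# lists of terms: [["a", "b"], ["c", "d"]].
# Distributing yields a sum of products, ac + ad + bc + bd,
# encoded as a list of term-lists.

def distribute(product_of_sums):
    """Expand a product of sums into a sum of products."""
    return [list(terms) for terms in product(*product_of_sums)]

print(distribute([["a", "b"], ["c", "d"]]))
# [['a', 'c'], ['a', 'd'], ['b', 'c'], ['b', 'd']]
```

The Cartesian product does exactly the bookkeeping of the distributive law here: one factor is chosen from each sum, in every possible way.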
Example 2: Let A and B be monoids in a braided monoidal category V. Then A ⊗ B is also a monoid, with multiplication:
where the crossing of the middle strands is provided by the braiding in V.
In Example 1, there is also a monoidal category in the background: the category of endofunctors on Set. But this category is not braided – which is why we need distributive laws!
Distributive laws, composite and lifted monads
Let S and T be monads on a category C. I’ll use Scarlet and Teal strings to denote S and T, resp., and white regions will stand for C.
A distributive law of S over T is a natural transformation ℓ : TS ⇒ ST, denoted
satisfying the following equalities:
A distributive law looks somewhat like a braiding in a braided monoidal category. In fact, it is a local pre-braiding: “local” in the sense of being defined only for S over T, and “pre” because it is not necessarily invertible.
As the above examples suggest, a distributive law allows us to define a multiplication μ : STST ⇒ ST:
It is easy to check visually that this makes ST a monad, with unit η^S η^T. For instance, the proof that μ is associative looks like this:
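For reference, a translation of these diagrams into symbols (assuming the convention that the distributive law is ℓ : TS ⇒ ST, so that ST is the composite):

```latex
% Unit and multiplication of the composite monad $ST$,
% built from a distributive law $\ell : TS \Rightarrow ST$:
\eta^{ST} = \eta^{S}\eta^{T}, \qquad
\mu^{ST} = (\mu^{S}\mu^{T}) \circ (S \ell T)
  \;:\; STST \Rightarrow SSTT \Rightarrow ST.
```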
Not only is ST a monad, we also have monad maps Sη^T : S ⇒ ST and η^S T : T ⇒ ST:
Asserting that Sη^T is a monad morphism is the same as asserting these two equalities:
Similar diagrams hold for η^S T. Finally, the multiplication μ also satisfies a middle unitary law:
To get back the distributive law, we can simply plug the appropriate units at both ends of μ:
This last procedure (plugging units at the ends) can be applied to any natural transformation μ : STST ⇒ ST. It turns out that if μ happens to satisfy all the previous properties as well, then we also get a distributive law. Further, the (distributive law ↦ multiplication) and (multiplication ↦ distributive law) constructions are mutually inverse:
Theorem The following are equivalent: (1) Distributive laws ℓ : TS ⇒ ST; (2) multiplications μ : STST ⇒ ST such that ST is a monad, Sη^T and η^S T are monad maps, and the middle unitary law holds.
In addition to making ST a monad, distributive laws also let us lift S to the category of T-algebras, C^T. Before defining what we mean by “lift”, let’s recall the universal property of C^T: Let D be another category; then there is an isomorphism of categories between [D, C^T] – the category of functors D → C^T and natural transformations between them – and the category of functors D → C equipped with a T-action, with natural transformations that commute with the T-action.
Given G̃ : D → C^T, we get a functor U^T G̃ : D → C by composing with the forgetful functor U^T. This composite has a T-action given by the canonical T-action on U^T. The universal property says that every functor D → C with a T-action is of the form U^T G̃. Similar statements hold for natural transformations. We will call G̃ and α̃ lifts of G and α, resp.
A monad lift of S to C^T is a monad S̃ on C^T such that U^T S̃ = S U^T, with the unit and multiplication of S̃ lying over those of S.
We may express the condition U^T S̃ = S U^T via the following equivalent commutative diagrams:
The diagram on the right makes it clear that S̃ being a monad lift of S is equivalent to S̃, η^S̃ and μ^S̃ being lifts of S, η^S and μ^S, resp. Thus, to get a monad lift of S, it suffices to produce a T-action on S U^T and check that it is compatible with η^S and μ^S. We may simply combine the distributive law ℓ with the canonical T-action on U^T to obtain the desired action on S U^T:
(Recall that the unlabelled white region is C. In subsequent diagrams, we will leave the red region unlabelled as well, and this will always be C^T. Similarly, teal regions will denote C^{ST}.)
Conversely, suppose we have a monad lift S̃ of S. Then the equality U^T S̃ = S U^T can be expressed by saying that we have an invertible natural transformation U^T S̃ ⇒ S U^T. Using this isomorphism and the unit and counit of the adjunction F^T ⊣ U^T that gives rise to T, we obtain a distributive law of S over T:
The key steps in the proof that these constructions are mutually inverse are contained in the following two equalities:
The first shows that the resulting distributive law in the (distributive law ↦ monad lift ↦ distributive law) construction is the same as the original distributive law we started with. The second shows that in the (monad lift S̃ ↦ distributive law ↦ another lift S̃′) construction, the T-action on S U^T (LHS of the equation) is the same as the original T-action on S U^T (RHS); hence, by virtue of being lifts, S̃ and S̃′ can only differ in their induced T-actions on S U^T, so they coincide. We thus have another characterization of distributive laws:
Theorem The following are equivalent: (1) Distributive laws ℓ: ST ⇒ TS; (3) monad lifts T̃ of T to C^S.
In fact, the converse construction did not rely on the universal property of C^S, and hence applies to any adjunction giving rise to S (with a suitable definition of a monad lift of T in this situation). In particular, it applies to the Kleisli adjunction F_S ⊣ U_S. Since the Kleisli category C_S is equivalent to the subcategory of free S-algebras (in the classical sense) in C^S, this means that to get a distributive law ℓ: ST ⇒ TS, it suffices to lift T to a monad over just the free S-algebras! (Thanks to Jonathan Beardsley for pointing this out!) The resulting distributive law may be used to get another lift of T, but we should not expect this to be the same as the original lift unless the original lift was “monadic” to begin with, in the sense of being a lift to C^S.
There are two further characterizations of distributive laws that are not mentioned in Beck’s paper, but whose equivalences follow easily. Eugenia Cheng in Distributive laws for Lawvere theories states that distributive laws ℓ: ST ⇒ TS are also equivalent to extensions of S to a monad on the Kleisli category C_T. This follows by duality from the above theorem, since Kleisli objects in Cat are Eilenberg–Moore objects in the opposite 2-category. Finally, Ross Street’s The formal theory of monads (which was also covered in a previous Kan Extension Seminar post) says that distributive laws in a 2-category K are precisely monads in Mnd(K). It is a fun and easy exercise to draw string diagrams for objects of Mnd(Mnd(K)); it becomes visually obvious that these are the same as distributive laws.
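The composite monad itself can also be made concrete in Python (a sketch of my own, not from the post): with S = List and T = Maybe, the distributive law ℓ: ST ⇒ TS is “sequencing”, and the composite monad TS acts on values of shape Maybe [a]; its multiplication is built from ℓ and the multiplications of S and T. The encodings ('Just', x) and 'Nothing' are mine.

```python
# A sketch (my own, not from the post) of the composite monad T S for
# S = List and T = Maybe, built from the distributive law
# l : S T => T S ("sequencing").  Values of T S have shape Maybe [a],
# encoded as ('Just', [..]) or the string 'Nothing'.

NOTHING = 'Nothing'

def just(x):
    return ('Just', x)

def dist(ms):
    """l : List(Maybe a) -> Maybe(List a); Nothing if any entry is Nothing."""
    out = []
    for m in ms:
        if m == NOTHING:
            return NOTHING
        out.append(m[1])
    return just(out)

def unit(x):
    """Unit of T S: x |-> Just [x]."""
    return just([x])

def fmap(f, mx):
    """Functor action of T S on functions."""
    return NOTHING if mx == NOTHING else just([f(x) for x in mx[1]])

def join(mmx):
    """Multiplication of T S, following T S T S => T T S S => T S:
    apply l inside, then flatten both layers.  Input: Maybe [Maybe [a]]."""
    if mmx == NOTHING:
        return NOTHING
    m = dist(mmx[1])                    # l : [Maybe [a]] -> Maybe [[a]]
    if m == NOTHING:                    # mu_T : collapse the two Maybes
        return NOTHING
    return just([x for xs in m[1] for x in xs])   # mu_S : concatenate
```

One can check the monad laws on samples; e.g. join(unit(just([1, 2]))) and join(fmap(unit, just([1, 2]))) both return just([1, 2]).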
Algebras for the composite monad
After characterizing distributive laws, Beck characterizes the algebras for the composite monad TS.
Just as a morphism of rings R → R′ induces a “restriction of scalars” functor R′-Mod → R-Mod, the monad maps S ⇒ TS and T ⇒ TS induce functors C^{TS} → C^S and C^{TS} → C^T.
Equivalently, we have S- and T-actions on the forgetful functor U^{TS}, which we call σ and τ. Let θ be the canonical TS-action on U^{TS}. The middle unitary law then implies that θ = τ ∘ Tσ:
Further, σ and τ are compatible with the distributive law ℓ in the following sense:
The properties of these actions allow us to characterize TS-algebras:
Theorem The category of algebras for the composite monad TS coincides with that of the lift T̃: C^{TS} ≅ (C^S)^{T̃}.
To prove this, Beck constructs a functor C^{TS} → (C^S)^{T̃} and its inverse. These constructions are best summarized in the following diagram of lifts:
On the left half of the diagram, we see that to get the functor C^{TS} → (C^S)^{T̃}, we must first produce a functor C^{TS} → C^S with a T̃-action. The S-action σ gives us a lift of U^{TS} to C^S. We also have the T-action τ on U^{TS}, which is compatible with σ via the distributive law. This is precisely what is required to lift τ to a T̃-action on the lifted functor, which gives us the desired functor C^{TS} → (C^S)^{T̃}.
On the right half of the diagram, to get the inverse (C^S)^{T̃} → C^{TS}, we need to produce a functor (C^S)^{T̃} → C with a TS-action. The obvious functor is U^S U^{T̃}, and we get an action by using the canonical actions of T̃ on U^{T̃} and of S on U^S:
All that’s left to prove the theorem is to check that these two functors are mutually inverse. In a similar fashion, we can prove the dual statement (again found in Cheng’s paper but not Beck’s):
Theorem The Kleisli category of the composite monad TS coincides with the Kleisli category of the extension of S to C_T.
Distributivity for adjoints
From now on, we identify C^{TS} with (C^S)^{T̃}. Under this identification, it turns out that the “restriction of scalars” functor C^{TS} → C^S is precisely the forgetful functor U^{T̃}, and we obtain what Beck calls a distributive adjoint situation comprising 3 pairs of adjunctions:
For this to qualify as a distributive adjoint situation, we also require that both composites from C to C^{TS} are naturally isomorphic, and both composites from C^{TS} to C are naturally isomorphic. This can be expressed in the following diagram by requiring both blue circles to be mutually inverse, and both red circles to be mutually inverse:
(Recall that colored regions are categories of algebras for the corresponding monads, and the cap and cup are the unit and counit of the relevant adjunction.)
This diagram is very similar to the diagram for getting a distributive law out of a monad lift T̃, and it is easy to believe that any such distributive adjoint situation (with 3 pairs of adjoints - not necessarily monadic - and the corresponding natural isomorphisms) leads to a distributive law.
Finally, suppose the “restriction of scalars” functor C^{TS} → C^T has an adjoint. This adjoint behaves like an “extension of scalars” functor, and Beck gives it a fitting name at the start of his paper. I’ll use different notation instead, to highlight its relationship with the lift T̃.
In such a situation, we get an adjoint square consisting of 4 pairs of adjunctions. By drawing these 4 adjoints in the following manner, it becomes clear which natural transformations we require in order to get a distributive law:
(Recall that S = U^S F^S and T = U^T F^T, so this is a “thickened” version of what a distributive law looks like.)
It turns out that given the natural transformation between the composite right adjoints, we can get a natural transformation between the corresponding composite left adjoints as its mate; one of these is invertible if and only if the other is. We may use either of them, along with the units and counits of the relevant adjunctions, to construct one half of the distributive law:
But this transformation points in the wrong direction, so we have to further require that it be invertible. We get the other half in a similar manner; since it will turn out to already point in the right direction, we will not require it to be invertible. Finally, given any 4 pairs of adjoints that look like the above, along with natural transformations satisfying the above properties, we will get a distributive law!
Beck ends his paper with some examples, two of which I’ve already mentioned at the start of this post. During our discussion, there were some remarks on these and other examples, which I hope will be posted in the comments below. Instead of repeating those examples, I’d like to end by pointing to some related works:
Since we’ve been talking about Lawvere theories, we can ask what distributive laws look like for Lawvere theories. Cheng’s Distributive laws for Lawvere theories, which I’ve already referred to a few times, does exactly that. But first, she comes up with 4 settings in which to define Lawvere theories! She also has a very readable introduction to the correspondence between Lawvere theories and finitary monads.
As Beck mentions in his paper, we can similarly define distributive laws between comonads, as well as mixed distributive laws between a monad and a comonad. Just as we can define bimonoids/bialgebras, and thus Hopf monoids/algebras, in a braided monoidal category, such distributive laws allow us to define bimonads and Hopf monads. There are in fact two distinct notions of Hopf monads: the first is described in this paper by Alain Bruguières and Alexis Virelizier (with a follow-up paper coauthored with Steve Lack, and a diagrammatic approach with amazing surface diagrams by Simon Willerton); the second is this paper by Bachuki Mesablishvili and Robert Wisbauer. The difference between these two approaches is described in the Mesablishvili-Wisbauer paper, but both involve mixed distributive laws. Gabriella Böhm also recently gave a talk entitled The Unifying Notion of Hopf Monad, in which she shows how the many generalizations of Hopf algebras are just instances of Hopf monads (in the first sense) in an appropriate monoidal bicategory!
We also saw that distributive laws are monads in a category of monads. Instead of thinking of distributive laws as merely a means of composing monads, we can study distributive laws as objects in their own right, just as monoids in a category of monoids (a.k.a. abelian monoids) are studied in their own right! The story for monoids terminates at this step: monoids in abelian monoids are just abelian monoids. But for distributive laws, we can keep going! See Cheng’s paper on Iterated Distributive Laws, where she shows the connection between iterated distributive laws and n-categories. In addition to requiring distributive laws between each pair of monads involved, it is also necessary to have a Yang-Baxter equation between every three monads:
Finally, there seems to be a strange connection between distributive laws and factorization systems (e.g. here, here and even in “Distributive Laws for Lawvere theories” mentioned above). I can’t say more because I don’t know much about factorization systems, but hopefully someone else can say something illuminating about this!
February 17, 2017
See Fermilab physicist Anne Schukraft's answers to readers’ questions about neutrinos.
I’ve been meaning to do a blog post about Erik Verlinde’s very interesting “Emergent Gravity” theory since it was first aired in November 2016, but never got round to it. However, this recent paper suggests that the new theory fails badly on scales of the Solar System. And when I say “badly”, I mean by seven orders of magnitude. That’s pretty bad.
Unless there’s something wrong with this analysis, this looks pretty terminal …
It was recently proposed that the effects usually attributed to particle dark matter on galaxy scales are due to the displacement of dark energy by baryonic matter, a paradigm known as emergent gravity. This formalism leads to predictions similar to Modified Newtonian Dynamics (MOND) in spherical symmetry, but not quite identical. In particular, it leads to a well defined transition between the Newtonian and the modified gravitational regimes, a transition depending on both the Newtonian acceleration and its first derivative with respect to radius. Under the hypothesis of the applicability of this transition to aspherical systems, we investigate whether it can reproduce observed galaxy rotation curves. We conclude that the formula leads to marginally acceptable fits with strikingly low best-fit distances, low stellar mass-to-light ratios, and a low Hubble constant. In particular, some unobserved wiggles are produced in rotation curves because of the dependence of the transition on the…
February 16, 2017
Why can a neutrino pass through solid objects?
Physicist Anne Schukraft of Fermi National Accelerator Laboratory explains.
February 14, 2017
In the second instalment of the functional equations course that I’m teaching, I introduced Shannon entropy. I also showed that up to a constant factor, it’s uniquely characterized by a functional equation that it satisfies: the chain rule.
Notes for the course so far are here. For a quick summary of today’s session, read on.
You can read the full story in the notes, but here I’ll state the main result as concisely as I can.
For n ≥ 1, let Δ_n denote the set of probability distributions p = (p_1, …, p_n) on {1, …, n}. The Shannon entropy of p ∈ Δ_n is
H(p) = − ∑_{i=1}^n p_i log p_i.
Given w ∈ Δ_n and further distributions p^1 ∈ Δ_{k_1}, …, p^n ∈ Δ_{k_n},
we obtain a composite distribution
w ∘ (p^1, …, p^n) = (w_1 p^1_1, …, w_1 p^1_{k_1}, …, w_n p^n_1, …, w_n p^n_{k_n}) ∈ Δ_{k_1 + ⋯ + k_n}.
The chain rule for H states that
H(w ∘ (p^1, …, p^n)) = H(w) + ∑_{i=1}^n w_i H(p^i).
So, (H: Δ_n → ℝ) is a sequence of continuous functions satisfying the chain rule. Clearly, the same is true of any nonnegative scalar multiple of H.
Theorem (Faddeev, 1956) The only sequences (I: Δ_n → ℝ) of continuous functions satisfying the chain rule are the scalar multiples of entropy.
One interesting aspect of the proof is where the difficulty lies. Let (I: Δ_n → ℝ) be continuous functions satisfying the chain rule; we have to show that I is proportional to H. All the effort and ingenuity goes into showing that I is proportional to H when restricted to the uniform distributions u_n = (1/n, …, 1/n). In other words, the hard part is to show that there exists a constant c such that
I(u_n) = c H(u_n)
for all n. But once that’s done, showing that I = cH is a pushover. The notes show you how!
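As a quick numerical sanity check (my own, not part of the notes), here is a Python sketch that encodes H and the composite distribution directly from the definitions above and verifies the chain rule on an example:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i, with 0 log 0 = 0."""
    return -sum(x * math.log(x) for x in p if x > 0)

def compose(w, ps):
    """The composite distribution w o (p^1, ..., p^n): the i-th outcome
    of w is split into the pieces w_i * p^i_j."""
    return [wi * pij for wi, pi in zip(w, ps) for pij in pi]

# Chain rule: H(w o (p^1, ..., p^n)) = H(w) + sum_i w_i H(p^i)
w = [0.5, 0.3, 0.2]
ps = [[0.5, 0.5], [1.0], [0.1, 0.2, 0.7]]
lhs = entropy(compose(w, ps))
rhs = entropy(w) + sum(wi * entropy(pi) for wi, pi in zip(w, ps))
assert abs(lhs - rhs) < 1e-9
```

The identity holds exactly; the tolerance only absorbs floating-point error.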
Standard Model predictions align with the LHCb experiment’s observation of an uncommon decay.
The Standard Model is holding strong after a new precision measurement of a rare subatomic process.
For the first time, the LHCb experiment at CERN has independently observed the decay of the Bs0 particle—a heavy composite particle consisting of a bottom antiquark and a strange quark—into two muons. The LHCb experiment co-discovered this rare process in 2015 after combining results with the CMS experiment.
Theorists predicted that this particular decay would occur only a few times out of a billion.
“Our measurement is slightly lower than predictions, but well within the range of experimental uncertainty and fully compatible with our models,” says Flavio Archilli, one of the co-leaders of this analysis and a postdoc at Nikhef National Institute for Subatomic Physics. “The theoretical predictions are very accurate, so now we want to improve our precision to see if our measurement is sitting right on top of the expected value or slightly outside, which could be an indication of new physics.”
The LHCb experiment examines the properties and decay patterns of particles to search for cracks in the Standard Model, our best description of the fundamental particles and forces. Any deviations from the Standard Model’s predictions could be evidence of new physics at play.
Supersymmetry, for example, is a popular theory that adds a host of new particles to the Standard Model and ameliorates many of its shortcomings—such as mathematical imbalances between how the different types of particles contribute to subatomic interactions.
“We love this decay because it is one of the most promising places to search for any new effects of supersymmetry,” Archilli says. “Scientists searched for this decay for more than 30 years and now we finally have the first single-experiment observation.”
This new measurement by the LHCb experiment combines data taken from Run 1 and Run 2 of the Large Hadron Collider and employs more refined analysis techniques, making it the most precise measurement of this process to date. In addition to measuring the rate of this rare decay, LHCb researchers also measured how long the Bs0 particle lives before it transforms into the two muons—another measurement that agrees with the Standard Model’s predictions.
“It's gratifying to have achieved these results,” says Universita di Pisa scientist Matteo Rama, one of the co-leaders of this analysis. "They reward the efforts made to improve the analysis techniques, to exploit our data even further. We look forward to updating the measurement with more data with the hope to observe, one day, significant deviations from the Standard Model predictions."
Advance your romance with science.
This Valentine’s Day, we challenged our readers to send us physics-inspired love poems. You answered the call: We received dozens of submissions—in four different languages! You can find some of our favorite entries below.
But first, as a warm-up, enjoy a video of real scientists at Fermi National Accelerator Laboratory reciting physics-related Valentine’s Day haiku:
Or read the haiku for yourself:
Thanks to all of our readers who submitted poems! In no particular order, here are some of our favorites:
For now, I’m seeing other quarks, some charming and some strange
But when we meet, I know we will all physics rearrange
For you, stop squark, will soon reveal the standard model as deficient
To me, you are my superpartner; the only one sufficient.
Without you, I just spin one-half of what our world could be
But you and I will couple soon in perfect symmetry.
All fundamental forces, we are meant to unify
In brilliant theory only love itself could clarify
Now though I may seem hypercharged and strongly interactive,
I must show my true colors if I hope to be attractive.
Without you, I just don’t feel really quite just like a top
But I’m confident I will yet find love in the name of stop.
- Jared Sagoff
The gravity that
Pulls my soul to you dilates:
Your beauty slows time.
- Philip Michaels
A Valentine for Two Quarks
Some people wish for one true love,
like dear old Ma and Pa.
That lifestyle’s not for us; we like
our quark ménage à trois.
You see, some like a threesome,
and I love both of you.
No green quark would be seen without
a red quark and a blue.
The sea is full of other quarks,
but darlings, I don’t heed ‘em.
You must believe I don’t exploit
my asymptotic freedom.
And when you pull away from me,
I just can’t take the stress.
My attraction just grows stronger
With you, my life is colourless;
you bring stability.
Without you, I’m unstable,
so I need you, Q.C.D.
I love our quirky, quarky love.
My Valentines, let’s carry on
exchanging gluons wantonly,
and make a little baryon.
- Cheryl Patrick
Will it work this time?
The wavefunction collapses.
Single once again.
Our hearts were once close; two nucleons held tight
By a force that was strong, and a love that burned bright.
But, that force became weaker as the days faded ‘way,
And with it, our bond began to decay.
I’ve realized that opposites don’t always attract
(Otherwise, the atom would be more compact),
And opposites we were, our differences great,
Continuing this way, we’d annihilate.
In truth, I’ve quite had it with your duality,
Your warm disposition; cold mentality.
We must be entangled - what else can explain
How, though we are distant, you still cause me pain?
We’ve exchanged mediators, but our half-lives were short,
All data suggests we should promptly abort.
Our collision is over, and signatures thereof
Have vanished, leaving us not a quantum of love.
- Peter Voznyuk
Love ignited light,
Eternal and everywhere:
A Cosmic Background
- Akshay Jogoo
Like energy dear
our love will last forever,
- Lauren Brennan
Barbora Bruant Gulejova, Fellow delegate
Married, 1 child, 36 years, Fellow IR-ECO
I started working at CERN in July 2014 as a User in the domain of knowledge transfer (Head of Community Activities of High Energy Physics Technology Transfer Network – HEPTech). Six months later I became a Fellow in Education and Outreach group (today ECO) where I have been working for more than two years on different projects including IPPOG (International Particle Physics Outreach Group) in the framework of my function of Scientific Secretary.
I have a diverse profile: a PhD in Thermonuclear Fusion, a Master’s in Management, and experience in scientific editing in international organizations. Working at CERN has been the best and most enriching experience of my career. The stimulating environment of the Organization opens new perspectives, allows one to develop new skills and supports creativity. I respect CERN’s values and I am proud to work here. Moreover, if one has the opportunity to contribute to the well-being of the Organization and one’s colleagues, it becomes even more rewarding. This is where the Staff Association plays a role. It is a statutory body that listens to CERN employees, including Fellows, and defends them. The Staff Association is driven by the willingness to maintain excellent working conditions for all colleagues.
Before coming to CERN, I had already lived and worked in Switzerland for 11 years, and my family lives in France. Thus, I know the employment conditions and social security systems of these two Host States well. I had been investigating these conditions for Fellows at CERN even before I became a Fellow. Fellows are employed by CERN, but unlike staff members they have no unemployment insurance in the current system. You may think that this is not important for you today, but if a Fellow finds herself on maternity leave at the end of her contract, or encounters a long-term health issue during or at the end of his/her contract, there can be a situation of hardship without any income. I consider it worthwhile to try to find solutions for this, and that was the principal reason I joined the Staff Association.
In this sense, a lot has been done for the Fellows by the Staff Association together with the Diversity Office. As a Fellow delegate, I had the chance to concentrate my efforts to contribute to the proposals for the 5-yearly review. Even if all the proposals of the Staff Association were not retained, today we can be grateful for increased parental benefits, for example the extension of health care for the whole period of maternity leave, longer paternity leave, increased flexibility for new parents, access to teleworking, and recognition of registered partnerships. See: http://diversity.web.cern.ch/diversity-measures-5-yearly-review.
It is my pleasure and honour to represent the needs and the voice of more than 700 Fellows at CERN. However, together with a colleague Fellow, Jiri, we are only a small team. There are 5 seats for Fellows in the Staff Council and our voice could be much stronger if they were filled. Moreover, Fellows’ contracts are up to 3 years and without continuity there is no success. We hope to have more of you, Fellow friends, on board for the next mandate starting in 2018.
The Staff Association is a friendly and enriching place where you can meet more experienced staff colleagues from all CERN departments and learn a lot about the Organization you work for.
I encourage my colleague Fellows to become members of the Staff Association and to represent the Fellows in the future Staff Council by participating in the elections in autumn 2017.
Let's get together, meet each other, exchange experiences and ideas, and share useful information on CERN and the Staff Association. Join us for Fellow's Apéro, organised by the Staff Association on Tuesday 21 February at 16.30 in Restaurant 1. There will be drinks and snacks for everybody! We look forward to seeing you there! Please confirm your participation on Doodle http://doodle.com/poll/skvm7ucm2z78i6bt or alternatively on Facebook https://www.facebook.com/events/1862757017340069/.
Your delegates in the Staff Association,
Barbora & Jiri
February 13, 2017
This semester, I’m teaching a seminar course on functional equations. Why? Among other reasons:
Because I’m interested in measures of biological diversity. Dozens (or even hundreds?) of diversity measures have been proposed, but it would be a big step forward to have theorems of the form: “If you want your measure to have this property, this property, and this property, then it must be that measure. No other will do.”
Because teaching a course on functional equations will force me to learn about functional equations.
Because it touches on lots of mathematically interesting topics, such as entropy of various kinds and the theory of large deviations.
Today was a warm-up, focusing on Cauchy’s functional equation: which functions f: ℝ → ℝ satisfy f(x + y) = f(x) + f(y)?
(I wrote about this equation before when I discovered that one of the main references is in Esperanto.) Later classes will look at entropy, means, norms, diversity measures, and a newish probabilistic method for solving functional equations.
Read on for today’s notes and an outline of the whole course.
I don’t want to commit to TeXing up notes every week, as any such commitment would suck joy out of something I’m really doing for intellectual fulfilment (also known as “fun”). However, I seem to have done it this week. Here they are. For those who came to the class, the parts in black ink are pretty much exactly what I wrote on the board.
Here’s the overall plan. We’ll take it at whatever pace feels natural, so the section numbers below don’t correspond to weeks. The later sections are pretty tentative — plans might change!
Warm-up Which functions f satisfy f(x + y) = f(x) + f(y)? Which functions of two variables can be separated as a product of functions of one variable?
Shannon entropy Basic ideas. Characterizations of entropy by Shannon, Faddeev, Rényi, etc. Relative entropy.
Deformed entropies Rényi and “Tsallis” entropies. Characterizations of them. Relative Rényi entropy.
Probabilistic methods Cramér’s large deviation theorem. Characterization of p-norms and power means.
Diversity of a single community Background and introduction. Properties of diversity measures. Value. Towards a uniqueness theorem.
Diversity of a metacommunity Background: diversity within and between subcommunities; beta-diversity in ecology. Link back to relative entropy. Properties.
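As a preview of the “Deformed entropies” section above, here is a small Python sketch (my own illustration, not from the course notes) of the Rényi entropy H_q(p) = log(∑_i p_i^q)/(1 − q) for q ≠ 1, with a numerical check that it recovers Shannon entropy in the limit q → 1:

```python
import math

def shannon(p):
    """Shannon entropy, -sum_i p_i log p_i (omitting zero probabilities)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi(q, p):
    """Renyi entropy of order q != 1: log(sum_i p_i^q) / (1 - q)."""
    return math.log(sum(x ** q for x in p)) / (1 - q)

p = [0.5, 0.25, 0.25]
# As q -> 1, the Renyi entropies converge to the Shannon entropy.
for q in (0.99, 0.999, 1.001, 1.01):
    assert abs(renyi(q, p) - shannon(p)) < 1e-2
```

For q = 0 this gives the log of the number of species present, and for q = 2 the “collision” entropy, so the one-parameter family interpolates between several classical diversity measures.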
You may have been following the ‘Division algebra and supersymmetry’ story, the last instalment of which appeared a while ago under the title M-theory, Octonions and Tricategories. John (Baez) was telling us of some work by his former student John Huerta which relates these entities. The post ends with a declaration which does not suffer from comparison to Prospero’s in The Tempest
But this rough magic
I here abjure. And when I have required
Some heavenly music – which even now I do –
To work mine end upon their senses that
This airy charm is for, I’ll break my staff,
Bury it certain fathoms in the earth,
And deeper than did ever plummet sound
I’ll drown my book.
Well, maybe not quite so poetic:
And with the completion of this series, I can now relax and forget all about these ideas, confident that at this point, the minds of a younger generation will do much better things with them than I could.
Anyway, you may be interested to know that the younger generation has pressed on. John Huerta teamed up with Urs Schreiber to write M-theory from the Superpoint (updated versions here), which looks to grow Lorentzian spacetimes, D-branes and M-branes out of a mere superpoint by the simple device of successive invariant higher central extensions.
It’s like a magical Whitehead tower where you can’t see how they put the rabbit in.
Construction has officially launched for the LZ next-generation dark matter experiment.
The race is on to build the most sensitive US-based experiment designed to directly detect dark matter particles. Department of Energy officials have formally approved a key construction milestone that will propel the project toward its April 2020 goal for completion.
The LUX-ZEPLIN experiment, which will be built nearly a mile underground at the Sanford Underground Research Facility in Lead, South Dakota, is considered one of the best bets yet to determine whether theorized dark matter particles known as WIMPs (weakly interacting massive particles) actually exist.
The fast-moving schedule for LZ will help the US stay competitive with similar next-gen dark matter direct-detection experiments planned in Italy and China.
On February 9, the project passed a DOE review and approval stage known as Critical Decision 3, which accepts the final design and formally launches construction.
“We will try to go as fast as we can to have everything completed by April 2020,” says Murdock “Gil” Gilchriese, LZ project director and a physicist at Lawrence Berkeley National Laboratory, the lead lab for the project. “We got a very strong endorsement to go fast and to be first.” The LZ collaboration now has about 220 participating scientists and engineers who represent 38 institutions around the globe.
The nature of dark matter—which physicists describe as the invisible component or so-called “missing mass” in the universe —has eluded scientists since its existence was deduced through calculations by Swiss astronomer Fritz Zwicky in 1933.
The quest to find out what dark matter is made of, or to learn whether it can be explained by tweaking the known laws of physics in new ways, is considered one of the most pressing questions in particle physics.
Successive generations of experiments have evolved to provide extreme sensitivity in the search that will at least rule out some of the likely candidates and hiding spots for dark matter, or may lead to a discovery.
LZ will be at least 50 times more sensitive to finding signals from dark matter particles than its predecessor, the Large Underground Xenon experiment, which was removed from Sanford Lab last year to make way for LZ. The new experiment will use 10 metric tons of ultra-purified liquid xenon to tease out possible dark matter signals.
“The science is highly compelling, so it’s being pursued by physicists all over the world,” says Carter Hall, the spokesperson for the LZ collaboration and an associate professor of physics at the University of Maryland. “It's a friendly and healthy competition, with a major discovery possibly at stake.”
A planned upgrade to the current XENON1T experiment at National Institute for Nuclear Physics’ Gran Sasso Laboratory in Italy, and China's plans to advance the work on PandaX-II, are also slated to be leading-edge underground experiments that will use liquid xenon as the medium to seek out a dark matter signal. Both of these projects are expected to have a similar schedule and scale to LZ, though LZ participants are aiming to achieve a higher sensitivity to dark matter than these other contenders.
Hall notes that while WIMPs are a primary target for LZ and its competitors, LZ’s explorations into uncharted territory could lead to a variety of surprising discoveries. “People are developing all sorts of models to explain dark matter,” he says. “LZ is optimized to observe a heavy WIMP, but it’s sensitive to some less-conventional scenarios as well. It can also search for other exotic particles and rare processes.”
LZ is designed so that if a dark matter particle collides with a xenon atom, it will produce a prompt flash of light followed by a second flash of light when the electrons produced in the liquid xenon chamber drift to its top. The light pulses, picked up by a series of about 500 light-amplifying tubes lining the massive tank—over four times more than were installed in LUX—will carry the telltale fingerprint of the particles that created them.
Daniel Akerib, Thomas Shutt and Maria Elena Monzani are leading the LZ team at SLAC National Accelerator Laboratory. The SLAC effort includes a program to purify xenon for LZ by removing krypton, an element that is typically found in trace amounts with xenon after standard refinement processes. “We have already demonstrated the purification required for LZ and are now working on ways to further purify the xenon to extend the science reach of LZ,” Akerib says.
SLAC and Berkeley Lab collaborators are also developing and testing hand-woven wire grids that draw out electrical signals produced by particle interactions in the liquid xenon tank. Full-size prototypes will be operated later this year at a SLAC test platform. “These tests are important to ensure that the grids don't produce low-level electrical discharge when operated at high voltage, since the discharge could swamp a faint signal from dark matter,” Shutt says.
Hugh Lippincott, a Wilson Fellow at Fermi National Accelerator Laboratory and the physics coordinator for the LZ collaboration, says, “Alongside the effort to get the detector built and taking data as fast as we can, we’re also building up our simulation and data analysis tools so that we can understand what we’ll see when the detector turns on. We want to be ready for physics as soon as the first flash of light appears in the xenon.” Fermilab is responsible for implementing key parts of the critical system that handles, purifies, and cools the xenon.
All of the components for LZ are painstakingly measured for naturally occurring radiation levels to account for possible false signals coming from the components themselves. A dust-filtering cleanroom is being prepared for LZ's assembly and a radon-reduction building is under construction at the South Dakota site—radon is a naturally occurring radioactive gas that could interfere with dark matter detection. These steps are necessary to remove background signals as much as possible.
The vessels that will surround the liquid xenon, which are the responsibility of the UK participants of the collaboration, are now being assembled in Italy. They will be built with the world's most ultra-pure titanium to further reduce background noise.
To ensure unwanted particles are not misread as dark matter signals, LZ's liquid xenon chamber will be surrounded by another liquid-filled tank and a separate array of photomultiplier tubes that can measure other particles and largely veto false signals. Brookhaven National Laboratory is handling the production of another very pure liquid, known as a scintillator fluid, that will go into this tank.
The cleanrooms will be in place by June, Gilchriese says, and preparation of the cavern where LZ will be housed is underway at Sanford Lab. Onsite assembly and installation will begin in 2018, he adds, and all of the xenon needed for the project has either already been delivered or is under contract. Xenon gas, which is costly to produce, is used in lighting, medical imaging and anesthesia, space-vehicle propulsion systems, and the electronics industry.
“South Dakota is proud to host the LZ experiment at SURF and to contribute 80 percent of the xenon for LZ,” says Mike Headley, executive director of the South Dakota Science and Technology Authority (SDSTA) that oversees the facility. “Our facility work is underway and we’re on track to support LZ’s timeline.”
UK scientists, who make up about one-quarter of the LZ collaboration, are contributing hardware for most subsystems. Henrique Araújo, from Imperial College London, says, “We are looking forward to seeing everything come together after a long period of design and planning.”
Kelly Hanzel, LZ project manager and a Berkeley Lab mechanical engineer, adds, “We have an excellent collaboration and team of engineers who are dedicated to the science and success of the project.” The latest approval milestone, she says, “is probably the most significant step so far,” as it provides for the purchase of most of the major components in LZ’s supporting systems.
Major support for LZ comes from the DOE Office of Science's Office of High Energy Physics, the South Dakota Science and Technology Authority, the UK's Science & Technology Facilities Council, and from collaboration members in South Korea and Portugal.
Editor's note: This article is based on a press release published by Berkeley Lab.
From 20 February to 3 March 2017
CERN Meyrin, Main Building
At the start, there is always the same minuscule point, placed at the centre of the space that the canvas is. Replicas of other points, condensed, aligned, isolated, scattered, will in their extension construct the line.
These lines, crossed, curved, deflected, extended, will be the structure containing and separating the matter of the colours.
Rotating each canvas while the work is in progress offers unlimited access to non-form and to form.
The final point will open onto different points of view of what the point and the line have become: a representation for the eye and the imagination.
Through painting, conveyed by the precise gesture, I reflect, I search and I travel, within the minuteness of points, through the unlimited possibilities of transformation.
For more information: firstname.lastname@example.org | Tel.: 022 766 37 38
Enrolments for the school year 2017-2018 in the Nursery, the Kindergarten and the School will take place on
6, 7 and 8 March 2017 from 10 am to 1 pm at EVE and School.
Registration forms will be available from Thursday 2nd March.
More information on the website: http://nurseryschool.web.cern.ch/.
Saturday 4 March 2017
Open day at EVE and School
of CERN Staff Association
Are you considering enrolling your child in the Children’s Day-Care Centre EVE and School of the CERN Staff Association?
If you work at CERN, then this event is for you: come visit the school and meet the Management
on Saturday 4 March 2017, from 10 am to 12 noon
We look forward to welcoming you and will be delighted to present our structure, its projects and premises to you, and answer all of your questions.
Sign up for one of the two sessions on Doodle via the link below before Wednesday 1st March 2017: http://doodle.com/poll/gbrz683wuvixk8as
Wednesday 15 February 2017 at 20:00
CERN Council Chamber
Directed by Richard Linklater
USA, 2001, 99 minutes
This is the story of a boy who dreams that he can float, but unless he holds on, he will drift away into the sky. Even when he is grown up, this idea recurs. After a strange accident, he walks through what may be a dream, flowing in and out of scenarios and encountering various characters. The people he meets discuss science, philosophy and the life of dreaming and waking, and the protagonist gradually becomes alarmed that he cannot wake from this confusing dream adventure.
Original version English; French subtitles
Wednesday 22 February 2017 at 20:00
CERN Council Chamber
Directed by Satoshi Kon
Japan, 2006, 90 minutes
When a machine that allows therapists to enter their patients' dreams is stolen, all Hell breaks loose. Only a young female therapist, Paprika, can stop it.
Original version Japanese; English subtitles
February 12, 2017
During winter conferences, which take place between mid-February and the end of March in La Thuile, Lake Louise, and other fashionable places close to ski resorts, experimentalists gather to show off their latest results. The same ritual repeats during the summer in a few more varied locations around the world.
February 10, 2017
Think you can do better than the Symmetry staff? Send us your poems!
Has the love of your life fallen for particle physics? Let the Symmetry team help you reach their heart—with haiku.
On Valentine’s Day, we will publish a collection of physics-related love poems written by Symmetry staff and—if you are so inclined—by readers like you!
Send your poems (haiku format optional) to email@example.com by Monday, February 13, at 10 a.m. Central. If we really like yours, we may send you a prize.
For inspiration, consider the following:
One impact that I mentioned a few years ago is also mentioned here, and that had to do not only with the impact of budget cuts, but also with the devastating impact of a budget cut AFTER several months of a continuing resolution of the US budget.
I remember one year, on December first, we had a faculty meeting where we heard funding levels would be up 10% across the board, a miraculous state of affairs after multiple years of flat-flat budgets (meaning no budgetary increases for cost-of-living adjustments, which ultimately amounts to a 3% cut). At our next faculty meeting, on December fifteenth, we heard that it was going to be a flat-flat year, par for the course. On December nineteenth, we heard the news that there was a 30% cut in funding levels.
Now, losing 30% of your budget is very bad in any circumstances, but you have to remember that the fiscal year begins on October first. The only thing you can do is fire people, since all the funding is salaries, and to do that legally takes about six weeks; with the holiday shutdown, that meant this was effectively a 50% cut in that year's remaining funding. There was some carry-forward and other budgetary manipulation, but 30% of the lab was lost, about three or four hundred people, if I recall correctly. The lab tried to shield career scientists and engineers, but still many dozens were let go.
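The arithmetic in this anecdote is worth spelling out. Here is a rough sketch in Python; the week counts are my own assumptions inferred from the dates in the post (fiscal year starting October first, cut announced December nineteenth, six weeks of notice, a holiday shutdown), not figures stated by the author:

```python
# Why a 30% annual cut announced mid-year behaves like a ~50% cut:
# the savings can only come out of the money not yet spent.
annual_budget = 100.0                  # normalized units
required_savings = 0.30 * annual_budget

# Assumed timeline: ~12 weeks elapsed (Oct 1 to Dec 19), plus ~6 weeks of
# legally required notice and ~2 weeks of holiday shutdown before any
# layoffs actually save money.
weeks_before_savings_begin = 12 + 6 + 2
weeks_remaining = 52 - weeks_before_savings_begin

# Funds still unspent when the savings finally start accruing:
spendable_remaining = annual_budget * weeks_remaining / 52
effective_cut = required_savings / spendable_remaining
print(f"Effective cut on the remaining year's funds: {effective_cut:.0%}")
```

With these assumed numbers the effective cut on the remaining funds comes out close to one half, matching the post's "50% cut in that year's funding".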
The Heilbronn Institute is the mathematical brand of the UK intelligence and spying agency GCHQ (Government Communications Headquarters). GCHQ is one of the country’s largest employers of mathematicians. And the Heilbronn Institute is now claiming to be the largest funder of “pure mathematics” in the country, largely through its many research fellowships at Bristol (where it’s based) and London.
In 2013, Edward Snowden leaked a massive archive of documents that shone a light on the hidden activities of GCHQ and its close partner, the US National Security Agency (NSA), including whole-population surveillance and deliberate stifling of peaceful activism. Much of this was carried out without the permission — or even knowledge — of the politicians who supposedly oversee them.
All this should obviously concern any mathematician with a soul, as I’ve argued. These are our major employers and funders. But you might wonder about the close-up picture. How do spy agencies such as GCHQ and the NSA work their way into academic culture? What do they do to ensure a continuing supply of mathematicians to employ, despite the suspicion with which most of us view them?
Alon Aviram of the Bristol Cable has just published an article on this, describing specific connections between GCHQ/Heilbronn and the University of Bristol — and, more broadly, academic mathematicians and computer scientists:
Alon Aviram, Bristol University working with the surveillance state. The Bristol Cable, 7 February 2017.
February 09, 2017
Cool! Fake Higgs boson news. (Or at least misleading headline.) https://t.co/RkVRwAoONB — Lisa Randall (@lirarandall) February 8, 2017
Via a Digg tweet, she was referring to the article in Vice's Motherboard:
Even though there were some other reactions among Lisa's followers – not really folks who follow particle physics, in most cases – whose reactions I will discuss, my response was very similar to Lisa's. The title is fake news (and the body of the article contains a diluted solution of it). Well, it is a falsehood at least to the extent that the negation of the proposition is much more true – and it is a much more important truth, too. What's going on?
By late 2011, the LHC had already accumulated a sufficient number of collisions that the folks could analyze the events with two photons in the final state rather finely, and they found an excess of events with two photons that seemed to arise from the decay of a new boson of mass \(125\GeV\). In other words, the invariant constructed from the two photons' four-momenta \(p^\mu,q^\mu\)\[
(p_\mu+q_\mu)(p^\mu+q^\mu) = m^2
\] was apparently equal to \((125\GeV)^2\) in many more cases than for other values of \(m\). Because a Higgs boson had to be discovered for the Standard Model to be experimentally completed as a consistent theory and this mass was consistent with everything else, I told you that a Higgs of this mass was a sure thing, although some people weren't this sane yet. ;-)
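The diphoton invariant mass above is simple to compute from the two four-momenta. A minimal sketch in Python; the photon four-momenta here are made-up illustrative values (two back-to-back massless photons), not LHC data:

```python
import math

def invariant_mass(p, q):
    """Invariant mass of a two-photon system.

    p, q are four-momenta (E, px, py, pz) in GeV; returns
    sqrt((p+q)_mu (p+q)^mu) with metric signature (+,-,-,-).
    """
    E = p[0] + q[0]
    px = p[1] + q[1]
    py = p[2] + q[2]
    pz = p[3] + q[3]
    return math.sqrt(E**2 - px**2 - py**2 - pz**2)

# Two back-to-back 62.5 GeV photons (massless, so E = |p|):
photon1 = (62.5, 62.5, 0.0, 0.0)
photon2 = (62.5, -62.5, 0.0, 0.0)
print(invariant_mass(photon1, photon2))  # → 125.0
```

A real analysis histograms this quantity over many events; a resonance shows up as a bump at its mass, here \(125\GeV\).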
Yes, the formality occurred on Independence Day in 2012: a sufficient number of collisions allowed both ATLAS and CMS to independently claim their 5-sigma discovery. (One detector had 4.8 or 4.9 sigma, but it's really an irrelevant historical coincidence now.) The Higgs was first discovered through the decays to \(\gamma\gamma\), i.e. two photons, or \(ZZ\), two Z-bosons, the heavier and equally neutral electroweak siblings of the photon.
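For reference, the "5 sigma" discovery convention corresponds to a one-sided Gaussian tail probability of roughly 3 in 10 million. A quick sketch of the conversion:

```python
import math

def one_sided_p_value(n_sigma):
    """One-sided Gaussian tail probability for an n-sigma excess."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

# "Evidence" (3 sigma) versus the "discovery" thresholds mentioned above:
for n in (3.0, 4.9, 5.0):
    print(f"{n} sigma -> p = {one_sided_p_value(n):.2e}")
```

This is why the difference between 4.8–4.9 sigma and 5.0 sigma is a historical footnote: the p-values differ by less than a factor of two, while the qualitative conclusion is the same.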
Because the mass was determined and the final state included the pairs of spin-one bosons, people could be sure about a new particle, its mass, and its interactions with the pairs of spin-one bosons. For the particle to be a Higgs boson or "the Higgs boson", it needed to have many other interactions – including those with fermion pairs – equal to the theoretically predicted value.
As the experiments at the LHC continued, they have proven that many more interactions of the new particle are exactly as strong and have exactly the same properties, up to error margins of a few dozen percent, as the Standard Model predicted. So doubts about whether it should be called a Higgs boson gradually evaporated.
So far it really looks like it is "the Higgs boson", i.e. a Higgs boson with the exact properties determined by the Standard Model that has no extensions, cousins, heavier or charged siblings, and other things. Experiments boldly yet humbly say that the world seems boring and the theorists are 50 years ahead of the experimenters because a flawless confirmation of theories written down 50 years ago is the best thing that the experimenters may do in the 2010s (naughty teenager decade or whatever is the right name for the teens) – while they will only be able to address some ideas we may already be sure about now around 2060. ;-)
This result, "everything seems compatible with the Standard Model", may be interpreted in various ways, i.e. as an argument against specific theories that try to replace the Higgs boson with something else or modify it heavily. Such "heavily alternative" theories that nevertheless want to be compatible with the \(125\GeV\) diphoton or \(ZZ\) signal are basically known as the Higgs impostor models. A man is just making fun of us, jumping inside the detectors, and pretending that he is Peter Higgs. ;-)
OK, I need to reverse this joke because 99% of the readers who also read the mainstream media would take it literally. No, dear brainwashed readers, the impostor isn't a man. It is a particle whose behavior resembles the behavior of the Higgs boson. It may have a wrong value of the spin or parity or be composite – but the properties may accidentally resemble the CP-even, spin-zero, elementary Higgs boson we assume that the new particle actually is.
I could give you links to numerous individual papers that analyzed the LHC collisions and determined that the Higgs impostor theories are basically dead. No one can emulate Higgs so accurately. So an important conclusion from the LHC experiments – that represent dozens of hours spent by hundreds of members of the ATLAS and CMS collaborations since 2011 or 2012 – is that
The Higgs Boson Found at the Large Hadron Collider Almost Certainly Cannot Be an ‘Impostor’
Now, look at the title in the Motherboard journal again:
Why the Higgs Boson Found at the Large Hadron Collider Could Be an ‘Impostor’
Haven't you seen it somewhere? Yes, this title is basically the exact opposite of an important truth. It is not just a falsehood. It is a falsehood that tries to push the readers away from a proposition that is not only true but also important, a proposition that actually represents the results of a lot of work that has taken place at the LHC.
Is it OK to write such falsehoods? Why has it happened?
Dr Usha Mallik, the Iowa professor, is working on a sub-detector that should be inserted somewhere into the LHC by 2023. This sub-detector could increase the ability to detect pairs of bottom quarks, which is hard, as I will mention again in a minute. And if the measurements of these decays of the Higgs to bottom quark pairs deviate from the Standard Model, and it is a big if, it could be evidence in favor of an impostor theory. But again, even if she succeeds, it's rather likely that the data will be compatible with the Standard Model, just like the data collected with the existing detectors so far.
The fake news title may be "justified" as a shortcut for a more honest title such as
a female experimenter in Iowa is working on a sub-detector that will almost certainly find nothing new even after 2023 when it's installed, but the work is justified by the observation that if it could find some deviation from the Standard Model, it would be evidence in favor of a Higgs impostor theory, a type of theory that seems almost dead by now, however.
In other words, a lady is doing some boring stuff that won't lead anywhere. This summary doesn't sound so sexy, so they have improved it and basically claimed that the Higgs that has been discovered "could" be an impostor even though the actual evidence that has been collected implies exactly the opposite – that it cannot be an impostor, at least not a generic one.
Let me mention that the Higgs boson is predicted to decay to\[
h \to b\bar b
\] the pair of a bottom quark and its antiquark, in a substantial percentage of cases (about 58% in the Standard Model). However, pairs of bottom quarks are created not only by decaying Higgs bosons but also by the "unavoidable QCD mess" at the LHC – which collides protons, i.e. hadrons, messy QCD bound states. You may read about this decay channel e.g. in this CERN Courier article. The observation of the direct decay to the bottom quark-antiquark pair is hard. However, there exists a more complicated process:
It's called the vector-boson fusion, or VBF.
Two quarks from the two colliding protons emit two massive electroweak gauge bosons (W-bosons or Z-bosons) that get merged into a Higgs boson, and that Higgs boson decays to the bottom quark-antiquark pair. What is nice about this Feynman diagram is that the final bottom quarks (the middle right part of the diagram) aren't directly connected to the quarks and gluons inside the LHC protons through quark and gluon (i.e. strongly interacting) propagators.
Instead, these bottom propagators are only connected to the quarks from the protons by electroweak propagators – by the Higgs and the electroweak spin-one bosons. This fact makes the central portion of the Feynman diagram clean – liberated from pollution by the messy QCD effects. Consequently, the complicated QCD effects don't contribute so much to the background, and the signal may be rather easily separated from it. And the data already collected at the LHC are extensive enough to bring us statistically significant evidence that the process depicted by the diagram is taking place.
The result is almost the same: when the signal is found, the interaction of the Higgs boson with the two bottom quarks is demonstrated. I discussed VBF in order to show that even if Usha Mallik succeeded in measuring the bottom quark decays of the Higgs using a new sub-detector, it probably wouldn't be a qualitatively new discovery that would bring theorists new information about an interaction between elementary particles. The \(hb\bar b\) interaction may be probed differently, without the new sub-detectors.
Experimental particle physicists have a rather hard job. Even relatively modest, technical advances that almost certainly won't "shift the paradigm" require years of work and sometimes billions of dollars. Even if it were discovered that the Higgs found in 2012 is an impostor, it would be something that 99.9999% of the mankind doesn't care about. For particle physicists, it could be a revolution. But this revolution is very unlikely to take place and it is arguably even less likely to take place because of Usha Mallik's work.
I know that these summaries don't sound as attractive as overhyped titles and they may be a worse starting point for Usha Mallik to get new grants etc. However, what I say is far more true and honest than what the Motherboard wrote.
It has unfortunately become standard policy to write falsehoods and lies as the titles of articles and even grant applications, and it's not just the very nasty people known as the journalists who are responsible for that. The scientists sometimes encourage such falsehoods themselves – and they often benefit from these lies even more than the journalists do. The ethical standards have dropped sufficiently that Usha Mallik and others think that "it's just OK to write any lie with keywords related to my research" when their work is being described by the media. Sorry, but it is not fine.
I have already mentioned that most of Lisa's followers don't have a clue. But it was one physics PhD student whose cluelessness was more visible:
He was clearly trying to chastise Lisa as if he were some moral authority. What the hell are you talking about, Christopher? Lisa hasn't tweeted any argument. She just made an observation that an article with a title that seems false to her – and to me and, more importantly, to most particle physicists, too – was published in a magazine.
The phrase "fake news" has been fashionable for several months. It's been used by the anti-Trump leftist media and it was often used inadequately. Many things labeled as "fake news" weren't fake at all – and on the contrary, many of the articles about "fake news", and especially about the origin of the alleged "fake news" (especially when the origin was claimed to be in Moscow), were the actual "fake news".
One may say that the term "fake news" has backfired. It's been used against those who used it first. There have been various social processes like that. But in the end, "fake news" is just a currently fashionable synonym for an "untrue article in the media". There have always been untrue articles in the media, and we have always needed some words to describe them. The term "fake news" does a better job than others – it's catchy and concise – and it's plausible that its lifetime will be longer than we might think.
Even if she hasn't posted any long blog post substantiating her views, there is nothing "unethical" about Lisa's usage of the term "fake news" for something that she considers a falsehood published by the media. Christopher, if you think that "ethics" means the rules to avoid several phrases that you have arbitrarily declared taboo, such as the phrase "fake news", then your ethics isn't worth much.
February 08, 2017
But as with the muon tomography case, I want to highlight an important fact that many people might miss.
To address these issues of existing methods and visualize the Cs contamination, we have developed and employed an Electron-Tracking Compton Camera (ETCC). ETCCs were originally developed to observe nuclear gammas from celestial objects in MeV astronomy, but have been applied in wider fields, including medical imaging and environmental monitoring.
So now we have an example of a device that was first developed for astronomical observation, but has found applications elsewhere.
This is extremely important to keep in mind. Experimental physics often pushes the boundaries of technology. We need better detectors, more sensitive devices, better handling of huge amounts of data very quickly, etc. Hardware has to be developed to do all this, and the technology from these scientific experiments often trickles down to other applications. Look at all of medical technology, which practically owes everything to physics.
This impact from physics must be repeated over and over again to the public, because a significant majority of them are ignorant of it. It is why I will continue to pick out applications like this and highlight them in case they are missed.
February 07, 2017
New experiments will help astronomers uncover the sources that helped make the universe transparent.
When we peer through our telescopes into the cosmos, we can see stars and galaxies reaching back billions of years. This is possible only because the intergalactic medium we’re looking through is transparent. This was not always the case.
Around 380,000 years after the Big Bang came recombination, when the hot mass of particles that made up the universe cooled enough for electrons to pair with protons, forming neutral hydrogen. This brought on the dark ages, during which the neutral gas in the intergalactic medium absorbed most of the high-energy photons around it, making the universe opaque to these wavelengths of light.
Then, a few hundred million years later, new sources of energetic photons appeared, stripping hydrogen atoms of their electrons and returning them to their ionized state, ultimately allowing light to easily travel through the intergalactic medium. After this era of reionization was complete, the universe was fully transparent once again.
Physicists are using a variety of methods to search for the sources of reionization, and finding them will provide insight into the first galaxies, the structure of the early universe and possibly even the properties of dark matter.
Current research suggests that most—if not all—of the ionizing photons came from the formation of the first stars and galaxies. “The reionization process is basically a competition between the rate at which stars produce ionizing radiation and the recombination rate in the intergalactic medium,” says Brant Robertson, a theoretical astrophysicist at the University of California, Santa Cruz.
However, astronomers have yet to find these early galaxies, leaving room for other potential sources. The first stars alone may not have been enough. “There are undoubtedly other contributions, but we argue about how important those contributions are,” Robertson says.
Active galactic nuclei, or AGN, could have been a source of reionization. AGN are luminous bodies, such as quasars, that are powered by black holes and release ultraviolet radiation and X-rays. However, scientists don’t yet know how abundant these objects were in the early universe.
Another, more exotic possibility is dark matter annihilation. In some models of dark matter, particles collide with each other, annihilating and producing matter and radiation. “If through this channel or something else we could find evidence for dark matter annihilation, that would be fantastically interesting, because it would immediately give you an estimate of the mass of the dark matter and how strongly it interacts with Standard Model particles,” says Tracy Slatyer, a particle physicist at MIT.
Dark matter annihilation and AGN may have also indirectly aided reionization by providing extra heat to the universe.
Probing the cosmic dawn
To test their theories of the course of cosmic reionization, astronomers are probing this epoch in the history of the universe using various methods including telescope observations, something called “21-centimeter cosmology” and probing the cosmic microwave background.
Astronomers have yet to find evidence of the most likely source of reionization—the earliest stars—but they’re looking.
By assessing the luminosity of the first galaxies, physicists could estimate how many ionizing photons they could have released. “[To date] there haven't been observations of the actual galaxies that are reionizing the universe—even Hubble can't deliver any of those—but the hope is that the James Webb Space Telescope can,” says John Wise, an astrophysicist at Georgia Tech.
Some of the most telling information will come from 21-centimeter cosmology, so called because it studies 21-centimeter radio waves. Neutral hydrogen gives off radio waves at this wavelength; ionized hydrogen does not. Experiments such as the forthcoming Hydrogen Epoch of Reionization Array will detect neutral hydrogen using radio telescopes tuned to the corresponding frequency. This could provide clinching evidence about the sources of reionization.
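The numbers behind "tuned to this frequency" are easy to sketch. The rest-frame 21-centimeter line sits near 1420 MHz, and light from the reionization era arrives redshifted to much lower frequencies; a minimal Python example (the redshift values are illustrative, not tied to any particular experiment):

```python
# Rest-frame frequency of the neutral-hydrogen hyperfine ("21 cm") line,
# and the observed frequency after cosmological redshift.
c = 299_792_458.0            # speed of light, m/s
rest_wavelength = 0.2110611  # rest wavelength of the HI line, m

f_rest = c / rest_wavelength  # about 1420.4 MHz
print(f"rest frequency: {f_rest / 1e6:.1f} MHz")

# Light emitted at redshift z arrives stretched by a factor (1 + z),
# so the observed frequency drops by the same factor:
for z in (6, 10, 15):
    print(f"z = {z:2d}: observed at {f_rest / (1 + z) / 1e6:.1f} MHz")
```

This is why reionization-era hydrogen is hunted in the low hundreds of MHz rather than at 1420 MHz, and why different observed frequencies map to different cosmic epochs.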
“The basic idea with 21-centimeter cosmology is to not look at the galaxies themselves, but to try to make direct measurements of the intergalactic medium—the hydrogen between the galaxies,” says Adrian Liu, a Hubble fellow at UC Berkeley. “This actually lets you, in principle, directly see reionization, [by seeing how] it affects the intergalactic medium.”
By locating where the universe is ionized and where it is not, astronomers can create a map of how neutral hydrogen is distributed in the early universe. “If galaxies are doing it, then you would have ionized bubbles [around them]. If it is dark matter—dark matter is everywhere—so you're ionizing everywhere, rather than having bubbles of ionizing gas,” says Steven Furlanetto, a theoretical astrophysicist at the University of California, Los Angeles.
Physicists can also learn about sources of reionization by studying the cosmic microwave background, or CMB.
When an atom is ionized, the electron that is released scatters and disrupts the CMB. Physicists can use this information to determine when reionization happened and put constraints on how many photons were needed to complete the process.
For example, physicists reported last year that data released from the Planck satellite was able to lower its estimate of how much ionization was caused by sources other than galaxies. “Just because you could potentially explain it with star-forming galaxies, it doesn't mean that something else isn't lurking in the data,” Slatyer says. “We are hopefully going to get much better measurements of the reionization epoch using experiments like the 21-centimeter observations.”
It is still too early to rule out alternative explanations for the sources of reionization, since astronomers are still at the beginning of uncovering this era in the history of our universe, Liu says. “I would say that one of the most fun things about working in this field is that we don't know exactly what happened.”
It appears that this black hole has been slowly feasting on this dead star for at least a decade. Ouch!
"We have witnessed a star's spectacular and prolonged demise," said Dacheng Lin, a research scientist at UNH's Space Science Center and the study's lead author. "Dozens of these so-called tidal disruption events have been detected since the 1990s, but none that remained bright for nearly as long as this one."
The arXiv version of this paper can be found here.
Moral of the story: Never piss off a black hole!