Particle Physics Planet


August 19, 2019

Christian P. Robert - xi'an's og

el lector de cadaveres [book review]

El lector de cadáveres (The Corpse Reader) by Antonio Garrido (from Valencia) is a historical novel I picked before departing to Japan, as the cover reminded me of van Gulik's Judge Dee, which I very much enjoyed (until a terrible movie came out!). Although van Gulik apparently took the idea from an 18th-century Chinese detective novel, Di Gong An. Anyway, this seemed like a good travel book, although heavy in my hiking backpack. After reading it mostly in trains, I am not so convinced that it was a great idea! The story stemmed from the author attending a legal medicine conference in Mumbai and hearing of a 13th-century Chinese medical expert who could be the very first forensic doctor. The book builds upon this historical character, romancing his early life from destitution to becoming the legal medicine expert of the Song emperor. There are a lot of similarities with Judge Dee, in that the meritocratic structure of the Chinese government is central to the main character joining the academy and sitting for the imperial examinations. That the underworld is never far from the ruling classes. That superstition is a permanent feature of everyday life. That cruelty is a part of justice, as is an intricate legal code. And that Confucianism strictly rules the society, from top to bottom. The historical part is rather nice, with a higher degree of detail and apparent authenticity than van Gulik's, which shows that the research undertaken by the author was quite fruitful. The plot however is terrible, with Song Ci falling into every possible trap, trusting every villain in the vicinity, and shooting himself in the foot at every occasion. The story goes from one disaster to the next, Ci being saved only by a last-minute benevolent passerby.
His brilliance as a forensic officer is hard to explain given the stupidity he demonstrates throughout, while the novel involves several others who also work on the medical analysis of corpses for the courts. A very lengthy suspension of disbelief!

by xi'an at August 19, 2019 10:19 PM

Peter Coles - In the Dark

A Reminiscence of Cricket

W.G. Grace, photographed in 1902

Not a lot of people know that Sir Arthur Conan Doyle, the creator of Sherlock Holmes, was a keen amateur cricketer who played ten first-class matches (for the MCC). He was an occasional bowler who only took one wicket in a first-class game, that of W.G. Grace, which was such a momentous event for him that he wrote this poem about it:

Once in my heyday of cricket,
One day I shall ever recall!
I captured that glorious wicket,
The greatest, the grandest of all.

Before me he stands like a vision,
Bearded and burly and brown,
A smile of good humoured derision
As he waits for the first to come down.

A statue from Thebes or from Knossos,
A Hercules shrouded in white,
Assyrian bull-like colossus,
He stands in his might.

With the beard of a Goth or a Vandal,
His bat hanging ready and free,
His great hairy hands on the handle,
And his menacing eyes upon me.

And I – I had tricks for the rabbits,
The feeble of mind or eye,
I could see all the duffer’s bad habits
And where his ruin might lie.

The capture of such might elate one,
But it seemed like one horrible jest
That I should serve tosh to the great one,
Who had broken the hearts of the best.

Well, here goes! Good Lord, what a rotter!
Such a sitter as never was dreamt;
It was clay in the hands of the potter,
But he tapped it with quiet contempt.

The second was better – a leetle;
It was low, but was nearly long-hop;
As the housemaid comes down on the beetle
So down came the bat with a chop.

He was sizing me up with some wonder,
My broken-kneed action and ways;
I could see the grim menace from under
The striped peak that shaded his gaze.

The third was a gift or it looked it-
A foot off the wicket or so;
His huge figure swooped as he hooked it,
His great body swung to the blow.

Still when my dreams are night-marish,
I picture that terrible smite,
It was meant for a neighboring parish,
Or any place out of sight.

But – yes, there’s a but to the story –
The blade swished a trifle too low;
Oh wonder, and vision of glory!
It was up like a shaft from a bow.

Up, up like a towering game bird,
Up, up to a speck in the blue,
And then coming down like the same bird,
Dead straight on the line that it flew.

Good Lord, it was mine! Such a soarer
Would call for a safe pair of hands;
None safer than Derbyshire Storer,
And there, face uplifted, he stands

Wicket keep Storer, the knowing,
Wary and steady of nerve,
Watching it falling and growing
Marking the pace and curve.

I stood with my two eyes fixed on it,
Paralysed, helpless, inert;
There was ‘plunk’ as the gloves shut upon it,
And he cuddled it up to his shirt.

Out – beyond question or wrangle!
Homeward he lurched to his lunch!
His bat was tucked up at an angle,
His great shoulders curved to a hunch.

Walking he rumbled and grumbled,
Scolding himself and not me;
One glove was off, and he fumbled,
Twisting the other hand free

Did I give Storer the credit
The thanks he so splendidly earned?
It was mere empty talk if I said it,
For Grace had already returned.

by Sir Arthur Conan Doyle (1859-1930).

 

 

by telescoper at August 19, 2019 06:24 PM

Dmitry Podolsky - NEQNET: Non-equilibrium Phenomena

5 Marketing Mistakes Franchisors Should Watch Out For

After finalizing your franchise plans, models, packages, and opportunities, the next step is selling your franchise. But as with all marketing activities, selling a franchise can be costly and resource-intensive, which is why you want to make sure that every marketing activity and resource you invest in has a good chance of attracting potential franchisees. It is therefore important for franchisors to be aware of common marketing mistakes that will not only cost money but may also push away potential franchisees.

Relying Too Much on Hard Selling

It's a huge plus to have aggressive franchise brokers and a franchise sales team, but you shouldn't rely on them alone. Many potential franchisees don't want to be "pushed" or "persuaded" into buying a franchise; they prefer to have all the information they need so that they can decide for themselves whether or not to buy. That said, you should provide this information on your website, and perhaps also in the franchise opportunity brochures and portfolios (both digital and printed) that your brokers and sales team can distribute to potential and interested franchisees.

Lacking Proof and Feedback


Even if your business is already well-known and has proven itself quite lucrative, you have to think like your potential franchisees. Both seasoned businesspeople and first-time entrepreneurs know that buying a franchise is a huge decision requiring a lot of money, which is why they want to minimize risk as much as possible by having proof that the franchise they're buying is indeed a safe and lucrative investment. As such, you should be able to provide data covering everything they need, such as ROI and forecasted sales, along with testimonials and feedback from your successful franchisees.

Not Having A Dedicated Franchising Website

If you're already opening your business to franchising, chances are you already have a business website with all your products and company information. A common franchisor mistake, however, is treating the "franchise page" as just a sub-page of that website. As mentioned earlier, many potential franchisees prefer to gather all the information they need to decide on a purchase, rather than having it "sold" to them. One of the best things you can do when selling your franchise, then, is to build an entirely separate website dedicated to franchising, containing all the information they need, and perhaps an online customer service representative they can chat with to learn more about the franchise opportunities you're selling.

Unclear or Poorly-Written/Made Reading Materials

Providing information to your potential franchisees is something that can't be stressed enough. However, it's vital to ensure that your instructional materials and franchise brochures/portfolios (including those on your website and social media) are written concisely and can be understood easily, to minimize confusion; potential clients are more likely to hesitate if the information you show is unclear, opaque, or simply poorly worded.

Neglecting Social Media


While you may have invested in print advertisements, active and professional sales agents, and maybe even TV/radio ads, you should give as much focus to digital marketing, specifically social media. Many consumers and entrepreneurs search for products, services, and even business and investment opportunities through social media. You should therefore focus on social media marketing for your franchise: this includes content creation as well as managing your franchising business's social media accounts. It also helps to have a social media account manager handle the inquiries arriving via comments and private messages from interested or curious parties.

Conclusion

Opening your business to franchise opportunities is a good sign that it has grown big enough and made a name for itself. But to boost your chances of selling your franchise, you should watch out for these common franchise marketing mistakes.

The post 5 Marketing Mistakes Franchisors Should Watch Out For appeared first on None Equilibrium.

by Bertram Mortensen at August 19, 2019 06:03 AM

August 18, 2019

Christian P. Robert - xi'an's og

hiking the Kumano Kodo Nakahechi imperial route

The Kumano Kodo is a network of pilgrimage paths towards places seen as sacred by Buddhists or Shintoists (or syncretists!) since the 700s. (The Kumano Hongu Taisha shrine celebrated its 2050th anniversary last year!) For non-believers, this means a well-established system of ancient hiking paths in the mountainous forests of the Kii peninsula, south of Osaka, Kyoto and Nara. Apart from the potential dangers of heavy rain, like massive mudslides, there is no particular difficulty in the hikes (which were once done wearing braided straw shoes, the waraji sandals), and a dense network of guest-houses and bus routes makes it possible to adapt the length of the day trips to one's speed. While this is the part of Japan with the most rain, the hot temperatures actually make it more than bearable when it is a shower and not a typhoon!

day one

14km, from Takijiri-oji to Chikatsuyu-oji, 7:30 hours, 1100m positive gain, nice trail all the way, maximum temperature 32°, met two dozen hikers and two trail runners, got tired by the end, confused one minshuku guest house for another, the hosts of the first one drove us to the second and later brought back the sandals I had left in their car, no English spoken at the second place and no onsen, but a very nice bento box and a pleasant conversation with a couple of Aussie-Danish young women spending a month in Japan and four on the trail. Entire path under cover, making the high heat so much more bearable! Not much in terms of views despite following ridges and visiting tops. Plenty of huge butterflies.

day two

6km, from Doguyaba-bashi (reached by bus) back to Tsugizakura, 2:30-3:00 hours, maximum temperature 30⁰, mixed trail and road, rain spells and showers, met three other couples of hikers, arrived too early at the guest house, waited in a one-room thatched traditional tea house by the side of the road with free hōji-cha tea offered by a very nice old man (not English-speaking, alas, which limited the exchanges), reached the guest house at the same time as a Canadian-Spanish couple embarked on a massive six-month trip. The dinner was imperial, with a whopping ten dishes, all delicate and beautiful. Plus we got our names written in Japanese kanji characters.

day three

9km, from Doguyaba-bashi (the host gave us a ride) to Hosshinmon-oji, 4 hours, 600m positive gain, maximum temperature 29⁰, heavier rain spells, first part really nice, then switching to a forest track due to closures, finishing with a detour on a tar road, took the infrequent bus to Yunomine-Onsen due to the closure. A few more couples and lone hikers passed us. Stayed at a larger minshuku guest-house with its own natural onsen, plus normally access to the river with swimming waterholes. Far from Hongu and Yunomine, alas. But also a welcome place to wait two days for typhoon Krosa to pass over Western Japan.


day four

14km, from Nachisan to Koguchi, 930m positive gain and 1260m negative gain!, alas cancelled as advised by the local tourist office due to uncertainties about the state of the trail; took the bus instead and enjoyed a swim after a very hot day in the now quiet Kumanogawachonichi, next to a most congenial guesthouse. Very quiet place, superlative food!


day five

13km, from Koguchi to Ukegawa, 690m positive gain, 5:30 hours, maximum temperature 29⁰, sunny, a very enjoyable part of the trail with many ridge walks, no road sharing, hardly any trace of the typhoon, and an early arrival in Hongu. Met a guest from the Yunomine-Onsen guest house hiking the other way. And very few others.

by xi'an at August 18, 2019 10:19 PM

ZapperZ - Physics and Physicists

Big Bang Disproved?!
Of course not! But this is still a fun video for you to watch, especially if you are not up to speed on (i) how we know that the universe is expanding and (ii) the current discrepancy in the measurement of the Hubble constant via two different methods.



But unlike in politics or social interactions, discrepancies and disagreements in science are actually welcomed, and are a fundamental aspect of scientific progress. They are how we refine and polish our knowledge into a more accurate form. As Don Lincoln says at the end of the video, scientists love discrepancies. They mean that there are more things we don't know, and more opportunities to learn and discover something new.

Zz.

by ZapperZ (noreply@blogger.com) at August 18, 2019 04:13 PM

Peter Coles - In the Dark

Sixty Years of Kind of Blue

I didn’t remember until late last night that yesterday was the 60th anniversary of the release, on 17th August 1959, of the classic jazz album Kind of Blue by a band led by trumpeter Miles Davis featuring John Coltrane (ts), Cannonball Adderley (as), Bill Evans (p, replaced by Wynton Kelly on one track), Paul Chambers (b) and Jimmy Cobb on drums. I bought the album on vinyl way back in the 1970s when I was still at school and have listened to it probably thousands of times since then. It still sounds fresh and exciting sixty years after its first release. But you don’t have to listen to me, you can listen to the whole album here:

When it first appeared, Kind of Blue seemed to represent all that Miles Davis stood for from a musical point of view, with its modal and scalar themes and such passages as the fourth section of Flamenco Sketches which hints at a Spanish influence. Whether the actual performances were typical of the way this band sounded live is less clear, but there’s no question that the album has worn so well as to be now universally regarded as a timeless masterpiece.

So why is it such an important album?

I can only speak for myself, of course, but I’d say a big part of this was that the music is on the cusp of the evolution of modern jazz. It’s music from a time of transition, pointing the way forward to exciting developments while also acknowledging past traditions. You only have to look at the various directions Miles Davis, John Coltrane and Bill Evans explored after this album to see what I mean.

A few words about each of the tracks:

The opening number So What established the practice of constructing themes based on scales instead of chords. After an introduction that keeps you guessing for a while, it turns out to be a straightforward 32-bar melody with a simple modulation serving as the bridge. At a medium tempo, the pure-toned and rather spare solo by Miles Davis provides a delicious contrast with the flurry of notes produced by Coltrane, who also plays between the beats. It might be just my imagination, but the rhythm section seems to tighten up behind him, only to relax again with Cannonball Adderley's more laid-back, bluesy approach.

The next track is All Blues, which is in a gentle 6/8 time. I discovered by accident a while ago that this composition found its way onto the GCSE Music syllabus. In fact there's a recording of the track, produced and distributed as a "set work" for that purpose:

As an aside, I should mention that I never took any qualifications in music at School – although I did get music lessons, I didn’t find them at all inspiring and it took me years to develop a taste for anything other than Jazz, which I knew about mainly from home, because my father was a (part-time) Jazz drummer. There wasn’t much mention of Jazz at School from teachers, and none of my friends were into it, so it became a very private passion, although I’m glad to say it never faded.

Anyway, what little I know about music I picked up by studying on my own, and trying to figure out what was going on by listening to records. All Blues is a really interesting composition to unpick in this way, as it tells you a lot about how Jazz was evolving in the late 1950s (it was released in 1959). Musicians like Miles Davis were experimenting with ways of breaking away from the standard approach to Jazz improvisation based on chord progressions, and one of the routes that developed was modal Jazz. All Blues is particularly interesting because it teeters on the edge between the old approach and the new; it’s clearly based on the traditional 12-bar blues progression but diverges from it in several respects.

A standard blues progression in G might go like this (although there are many variations):

|G|G|G|G|
|C|C|G|G|
|D|C|G|G|

It's based on just three chords: the tonic I (in this case G), the sub-dominant IV (C) and the dominant V (D); the V-IV-I progression in the last four bars is usually called the turnaround.
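The degree arithmetic above is mechanical enough to sketch in code. This toy helper (my own illustration, not from the post) finds I, IV and V for any key by counting semitones and lays out the standard 12-bar grid:

```python
# The three chords of a standard 12-bar blues come from scale degrees
# I, IV and V: the sub-dominant sits 5 semitones above the tonic,
# the dominant 7 semitones above.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def blues_bars(root: str) -> list[str]:
    i = NOTES.index(root)
    I, IV, V = root, NOTES[(i + 5) % 12], NOTES[(i + 7) % 12]
    # four bars of I, two of IV, two of I, then the V-IV-I turnaround
    return [I] * 4 + [IV] * 2 + [I] * 2 + [V, IV, I, I]

print(blues_bars("G"))
# ['G', 'G', 'G', 'G', 'C', 'C', 'G', 'G', 'D', 'C', 'G', 'G']
```

Reading the output four bars per line reproduces the G progression shown above.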

The progression for All Blues is this:

|G7| G7| G7| G7|
|Gm7| Gm7| G7| G7|
|D7| E♭7 D7| F G|F G6|

While the addition of a 7th to the basic G triad isn't unusual, the two G minor 7th chords are more interesting, because they involve adding a blue note (a flattened third) to the basic chord. But it's in the last four bars that the harmonies move dramatically away from the standard turnaround. Chromatic chords are included, and the usual resolution back to G is subtly changed by the addition of a 6th (E) to the basic G chord (GBD) at the end; that trick became a bit of a trademark for Jazz of this period.

However, it’s the two F chords that represent the strongest connection with modal harmony. The scale of G major involves F-sharp, so the F is a flattened note (a flattened VIIth). In fact, all the Fs in the piece are natural rather than sharp. For this reason you could argue that this is a piece not played in the key of G major but in the corresponding Mixolydian mode (the white notes on the piano from G to G).
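The modal point can be checked mechanically: the only difference between G major and G Mixolydian is that seventh degree. A small sketch (my own illustration, not from the post) builds both scales from their whole/half-step patterns:

```python
# Derive G major and G Mixolydian from their step patterns, showing
# why all the Fs in All Blues are natural rather than sharp.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

MAJOR_STEPS = [2, 2, 1, 2, 2, 2, 1]       # whole/half steps of a major scale
MIXOLYDIAN_STEPS = [2, 2, 1, 2, 2, 1, 2]  # same pattern with a flattened 7th

def build_scale(root: str, steps) -> list[str]:
    i = NOTES.index(root)
    scale = [root]
    for step in steps[:-1]:  # the final step just returns to the octave
        i = (i + step) % 12
        scale.append(NOTES[i])
    return scale

print(build_scale("G", MAJOR_STEPS))       # ['G', 'A', 'B', 'C', 'D', 'E', 'F#']
print(build_scale("G", MIXOLYDIAN_STEPS))  # ['G', 'A', 'B', 'C', 'D', 'E', 'F']
```

The second line is exactly the white notes on the piano from G to G.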

So it’s a blues that’s not quite a blues, but is (appropriately enough) Kind of Blue. There’s so much going on harmonically that the fact that it’s played in 6/8 rhythm (rather than the more usual 4/4 for the Blues) seems almost irrelevant.

Those are just the bare bones, but the improvisations of Miles Davis, Bill Evans, Cannonball Adderley and John Coltrane et al breathe life into them and create a living Jazz masterpiece. Although it seems like a complicated tune, apparently what happened at the recording session was that Miles Davis talked the band through the piece, they played it once to get a feel for it, and then recorded the entire track that was released on the album in one go.

On Freddie Freeloader, Bill Evans was replaced by Wynton Kelly. I suppose Miles Davis thought that Kelly would be more convincing on this relatively straight-ahead blues, and Kelly's crisp, direct opening solo suggests that Miles was probably right. Miles Davis's solo that follows is superbly structured in terms of timing and dynamics. Coltrane plays more or less entirely in double time, and then Adderley enjoys himself hugely in a good-humoured final solo.

Blue in Green, which was mainly written by Bill Evans, is based on a ten-bar melody featuring an eloquent solo by Miles on muted trumpet and some sensitive playing by Coltrane and Evans. The same mood prevails in the following track.

Flamenco Sketches involves a series of solos, each improvised on a set of five scales; it's the fourth section that hints at the Spanish influence alluded to in the title. The tempo is very slow, which contributes to the air of solemnity, as does the absolute perfection of the solos. In that respect it has clear parallels with some of Duke Ellington's work. Miles Davis, who opens and closes the track on muted trumpet, and Bill Evans on piano are absolutely faultless, but I particularly enjoy John Coltrane's playing on tenor saxophone: his tone is as bleak and austere as an Arctic sunrise, and just as wonderful, and he conjures up an absolutely beautiful improvised melody.

I’ll end with a comment on the album Kind of Blue, by Stephen Thomas Erlewine who wrote

Kind of Blue isn’t merely an artistic highlight for Miles Davis, it’s an album that towers above its peers, a record generally considered as the definitive jazz album, a universally acknowledged standard of excellence. Why does Kind of Blue possess such a mystique? Perhaps because this music never flaunts its genius… It’s the pinnacle of modal jazz — tonality and solos build from the overall key, not chord changes, giving the music a subtly shifting quality… It may be a stretch to say that if you don’t like Kind of Blue, you don’t like jazz — but it’s hard to imagine it as anything other than a cornerstone of any jazz collection.

People sometimes ask me why I post about music on here. The answer has two parts, and both are simple. One is that I enjoy writing about music because it gives me the opportunity to explore my own thoughts about why I like it so much. The other is to share something I love very much, in the hope that other people might find as much joy in the music as I do. For example, if just one person listens to Kind of Blue for the first time as a result of reading this piece, then it will definitely have been worth the 40 minutes it took me to write!

by telescoper at August 18, 2019 01:41 PM

August 17, 2019

Peter Coles - In the Dark

Black Cat Appreciation Day

Apparently today, August 17th, is Black Cat Appreciation Day, so I couldn't resist acknowledging the contributions of Maynooth University Library Cat, who I think is largely responsible for the increase in the number of students coming to this University.

Unfortunately he hasn't been appreciating the rainy weather over the past few days…

by telescoper at August 17, 2019 04:08 PM

August 16, 2019

Christian P. Robert - xi'an's og

deadlines for BayesComp’2020

While I forgot to send a reminder that August 15 was the deadline for early registration for BayesComp 2020, here are the further deadlines and dates:

  1. BayesComp 2020 occurs on January 7-10 2020 in Gainesville, Florida, USA
  2. Registration is open with regular rates till October 14, 2019
  3. Deadline for submission of poster proposals is December 15, 2019
  4. Deadline for travel support applications is September 20, 2019
  5. There are four free tutorials on January 7, 2020, related to Stan, NIMBLE, SAS, and AutoStat

by xi'an at August 16, 2019 10:19 PM

Peter Coles - In the Dark

Maynooth Offers

Well, I've had a busy week here in Maynooth marking and checking repeat examinations (just finished this morning), during which I've also been keeping an eye on student admissions for the forthcoming year, both here and at other institutions across Ireland. Universities and students received the Leaving Certificate results earlier in the week, but institutions then had a couple of days to decide, on the basis of course capacity and the results obtained, which students would receive offers of a place on which courses. This is usually expressed in terms of a points total: the more popular the course, and the better the results of applicants to that course, the higher the points required. Yesterday first-round offers went out from the CAO across the country; there's a summary in the Irish Times. Students who don't get an offer from their first-choice course can try in subsequent rounds to get a place at another institution.

As of yesterday afternoon, Maynooth University is expecting to admit 3,225 new first year students this year. This is the largest ever intake for the university and represents an increase of 3% from last year. This growth reflects a strong demand for places: more than 4,200 students chose Maynooth University as their first preference, an increase of 7% from last year (which I mentioned earlier this year).

At the moment it looks like being a particularly good year for our BSc Course in Theoretical Physics and Mathematics, but I’d rather wait until the process is over and numbers are confirmed before commenting further.

Anyway, as the CAO process is ongoing, I thought I'd include this little video about what Maynooth has to offer undergraduate students, with particular emphasis on the flexibility of its programmes, whether in Arts & Humanities or Sciences. I wrote about the advantages of the 'Omnibus Science' programme here. If you are reading this and didn't happen to get the points for your first-choice course, then you could do a lot worse than consider Maynooth!

by telescoper at August 16, 2019 02:33 PM

Dmitry Podolsky - NEQNET: Non-equilibrium Phenomena

Keep the Heat Up in Your Oven

Industrial equipment can be sturdy, but it is not unbreakable: it will run into problems if you don't maintain it. One of the more important pieces of equipment is your industrial oven. Considering how it blasts incredible amounts of heat into a small space, it is surprising that it doesn't break down more often. With the following tips, you can ensure that problems with it will be few and far between:

Lubricate the Blower

An essential part of the oven, the blower motor supplies the air the oven needs. Without oxygen there is no fire, so keeping it running is important, and lubrication is critical here. Some models don't need lubrication but do need regular cleaning. Lubricating the blower every six months ensures that it keeps doing its job; check the manual for the correct way to lubricate the motor. With regular cleaning and lubrication, you can prevent the sudden breakdowns that would ruin your oven's performance and productivity.

Maintain the Airflow

If you want your blower motor to work well, nothing must restrict the airflow to the oven. Position the oven so that the blower motor's air inlets are clear: placing items around the inlets is a bad idea, and clearing the space around the oven ensures maximum air intake.

Use the Right Voltage


Industrial ovens consume great amounts of electricity to function well. If they are not supplied with the right power, they will not perform well. For example, if you hook up a 240 VAC machine to a weaker supply such as 200 VAC, you can expect lower performance. Always plug your machines into the proper power supply for the best results.
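The performance loss is easy to quantify under a simplifying assumption of mine (not from the post): for a purely resistive heating element, power scales as V²/R, so delivered power falls with the square of the voltage ratio.

```python
# For a resistive heating element, P = V^2 / R, so the fraction of rated
# power delivered at a lower supply voltage is (V_actual / V_rated)^2.
def relative_power(v_actual: float, v_rated: float) -> float:
    return (v_actual / v_rated) ** 2

# A 240 VAC oven on a 200 VAC supply gets only about 69% of rated power.
print(round(relative_power(200, 240), 3))  # 0.694
```

Real ovens with controllers and motors deviate from this ideal, but the square-law drop explains why undervolting noticeably hurts heat-up times.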

Check the Inside

There are several components that you will need to check every few months. One of these is the heating elements: they heat everything in the oven, and when they malfunction you will not get the high temperatures you need. If they break, replace them. Do the same for other components such as the thermal sensor and the wiring.

Know When to Replace

Even the sturdiest items break. If your oven is old, you will eventually need to replace it; any model over ten years old will lack the innovations of newer models. This is when you should start shopping around. Suppliers like SupaGEEK Designs can offer your company a great new oven that will meet your requirements. Weigh whether it is really worth keeping your old oven rather than buying a new one and enjoying the benefits.

Your industrial oven can cause delays down the line, which is bad news for your company's productivity and bottom line. This is why the tips above are so important: with their help, you can keep breakdowns to a minimum and make sure your oven performs well every day. The results are better productivity and a bump in profits.

The post Keep the Heat Up in Your Oven appeared first on None Equilibrium.

by Bertram Mortensen at August 16, 2019 02:36 AM

August 15, 2019

Jon Butterworth - Life and Physics

Nature Careers: Working Scientist podcast
I talked to Julie Gould for Nature recently, about the challenges of working on big collaborations, of doing physics in the media spotlight, on why LHC had more impact with the public than LEP, and more. (I also occasionally manage … Continue reading

by Jon Butterworth at August 15, 2019 06:19 PM

Peter Coles - In the Dark

What The World Needs Now

I've always been a not-so-secret admirer of the American songwriter and record producer Burt Bacharach, but when someone told me the other day that there's an album called Blue Note Plays Burt Bacharach, I assumed it was a wind-up, because Blue Note Records has for many years been an uncompromising voice at the cutting edge of modern jazz, rather than the lighter and more popular form of music exemplified by Mr B.

There's no reason why two forms of excellence can't coexist, however, and the album is definitely real: a very nice compilation of Bacharach numbers from Blue Note albums featuring various musicians over the years. Here's an example featuring Stanley Turrentine on tenor sax, with McCoy Tyner on piano, Bob Cranshaw on bass and Mickey Roker on drums. The tune is What The World Needs Now Is Love. Doesn't it just?

by telescoper at August 15, 2019 06:12 PM

Emily Lakdawalla - The Planetary Society Blog

OSIRIS-REx Team Picks 4 Candidate Sample Sites on Asteroid Bennu
The sites each have unique characteristics that would advance the field of asteroid science.

August 15, 2019 11:28 AM

John Baez - Azimuth

Carbon Offsets

A friend asks:

A quick question: if somebody wants to donate money to reduce his or her carbon footprint, which org(s) would you recommend that he or she donate to?

Do you have a good answer to this? I don’t want answers that deny the premise. We’re assuming someone wants to donate money to reduce his or her carbon footprint, and choosing an organization based on this. We’re not comparing this against other activities, like cutting personal carbon emissions or voting for politicians who want to cut carbon emissions.

Here’s my best answer so far:

The Gold Standard Foundation is one organization that tackles my friend’s question. See for example:

• Gold Standard, Offset your emissions.

Here they list various ways to offset your carbon emissions, currently with prices between $11 and $18 per tonne.

The Gold Standard Foundation is a non-profit foundation headquartered in Geneva that tries to ensure that carbon credits are real and verifiable and that projects make measurable contributions to sustainable development.
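To put the quoted price range in concrete terms, here is a trivial sketch of the arithmetic. The 15-tonne annual footprint is my own illustrative assumption, not a figure from the post:

```python
# Rough cost of offsetting a carbon footprint at the quoted Gold Standard
# range of $11-$18 per tonne of CO2.
def offset_cost(tonnes: float, price_low: float = 11.0, price_high: float = 18.0):
    """Return the (low, high) cost in dollars for offsetting `tonnes` of CO2."""
    return tonnes * price_low, tonnes * price_high

low, high = offset_cost(15)  # a hypothetical 15-tonne annual footprint
print(f"${low:.0f}-${high:.0f}")  # $165-$270
```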

by John Baez at August 15, 2019 05:34 AM

August 14, 2019

ZapperZ - Physics and Physicists

Relativistic Length Contraction Is Not So Simple To See
OK, I actually had fun reading this article, mainly because it opened up a topic that I was only barely aware of. This Physics World article describes the simple issue of length contraction, but then delves into why OBSERVING this effect, such as with our own eyes, is not so simple.

If the Starship Enterprise dipped into the Earth’s atmosphere at a sub-warp speed, would we see it? And if the craft were visible, would it look like the object we’re familiar with from TV, with its saucer section and two nacelles? Well, if the Enterprise were travelling fast enough, then – bright physicists that we are – we’d expect the craft to experience the length contraction dictated by special relativity.

According to this famous principle, a body moving relative to an observer will appear slightly shorter in the direction the body’s travelling in. Specifically, its observed length will have been reduced by the Lorentz factor (1 − v²/c²)^(1/2), where v is the relative velocity of the moving object and c is the speed of light in a vacuum. However, the Enterprise won’t be seen as shorter despite zipping along so fast. In fact, it will appear to be the same length, but rotated.

You might not have heard of this phenomenon before, but it’s often called the “Terrell effect” or “Terrell rotation”. It’s named after James Terrell – a physicist at the Los Alamos National Laboratory in the US, who first came up with the idea in 1957. The apparent rotation of an object moving near the speed of light is, in essence, a consequence of the time it takes light rays to travel from various points on the moving body to an observer’s eyes.
You can read the rest of the explanation and the graphics in the article. Again, this is not to say that the "pole-in-barn" exercise you did in relativity lessons is invalid. It is just that, in that case, you were not asked what you would actually SEE with your eyes as the pole passes through the barn, and the pole is long and thin, as opposed to an object of substantial size and width. The notion that such an object would be seen with our eyes as flat as a pancake arguably may not be true.
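For readers who want to see the numbers, here is a minimal sketch of the Lorentz factor from the excerpt; the 725 m rest length is just an illustrative stand-in for a large ship, not a canonical figure:

```python
import math

def lorentz_factor(beta):
    """sqrt(1 - v^2/c^2): the contraction factor for beta = v/c."""
    if not 0 <= beta < 1:
        raise ValueError("beta must satisfy 0 <= v/c < 1")
    return math.sqrt(1.0 - beta * beta)

rest_length = 725.0  # metres -- an assumed, illustrative ship length
for beta in (0.1, 0.5, 0.9, 0.99):
    contracted = rest_length * lorentz_factor(beta)
    print(f"v = {beta:.2f}c: measured length = {contracted:6.1f} m")
```

Note this is the length an observer would *measure* (e.g. with simultaneous position markings); as the article explains, what the eye would *see* is the Terrell rotation, not a pancaked ship.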

Zz.

by ZapperZ (noreply@blogger.com) at August 14, 2019 04:17 PM

Emily Lakdawalla - The Planetary Society Blog

How to See LightSail 2 in the Night Sky
If your latitude is within 42 degrees of the equator, there's a chance you may be able to spot LightSail 2's reflective solar sail.

August 14, 2019 11:00 AM

Lubos Motl - string vacua and pheno

Coincidences, naturalness, and Epstein's death
The circumstances of Jeffrey Epstein's death seem to be a drastic but wonderful metaphor for naturalness in physics: those who say "there is nothing to see here" in the irregularities plaguing Epstein's jail seem to be similar to those who say "there is nothing to see here" when it comes to fine-tuning or unlikely choices of parameters in particle physics.

As far as I can say, a rational person who thinks about these Epstein events proceeds as follows:
  • an invention of rough hypotheses or classes of hypotheses
  • usage of known or almost known facts to adjust the probabilities of each hypothesis
It's called logical or Bayesian inference! That's a pretty much rigorous approach justified by basic probability calculus – which is just a continuous generalization of mathematical logic. The opponents of this method seem to prefer a different Al Gore rhythm:
  • choose the winning explanation at the very beginning, according to some very simple e.g. ideological criteria or according to your own interests; typically, the winning explanation is the most politically correct one
  • rationalize the choice by saying that all other possible explanations are hoaxes, conspiracy theories, "not even wrong" theories that are simultaneously unfalsifiable and already falsified, and by screaming at, accusing, and insulting those who argue that their other choices seem more likely – often those who do some really fine research
Which of the approaches is more promising as a path towards the truth? Which is the more honest one? These are rhetorical questions – of course Bayesian inference is the promising and ethical approach while the other one is a sign of stupidity or dishonesty. I am just listing the "second approach" to emphasize that some people are just dumb or dishonest – while they or others often fail to appreciate this stupidity or dishonesty.
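The Bayesian updating described above fits in a few lines of code. This is a generic sketch with made-up hypotheses and likelihoods, not actual probabilities for any real case:

```python
def bayes_update(priors, likelihoods):
    """Posterior P(H|E) proportional to P(E|H) * P(H), normalized over hypotheses."""
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Two placeholder hypotheses with equal priors; a piece of evidence E
# assumed to be 4x as likely under H2 as under H1 (illustrative numbers).
priors = {"H1": 0.5, "H2": 0.5}
likelihoods = {"H1": 0.1, "H2": 0.4}
posterior = bayes_update(priors, likelihoods)
print(posterior)  # H2 now carries 0.8 of the probability
```

Each new piece of evidence is folded in the same way, with the posterior from one update becoming the prior for the next.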



OK, the basic possible explanations of the reported death seem to be the following:
  1. Epstein committed suicide and all the "awkward coincidences" are really just coincidences that don't mean anything
  2. Epstein committed suicide and someone helped to enable this act, perhaps because of compassion
  3. Epstein was killed by somebody and it's accidentally hard to determine who was the killer because the cameras etc. failed to do their job
  4. Epstein was killed by somebody who took care of details and most of these coincidences are issues that the killer had to take care of
  5. Epstein is alive – he was probably transferred somewhere and will be allowed a plastic surgery and new identity
I have ordered the stories in a certain way – perhaps from the most "politically correct" to the most "conspiracy theory-like" explanations. I had to order them in some way. Also, some completely different explanation could be completely missing in my list – but at some level, it should be possible to group the explanations to boxes according to Yes/No answers to well-defined questions which means that there is a semi-reliable way to make sure that you won't miss any option.



OK, I think that there are lots of politically correct, basically brainwashed and brain-dead, people who imagine a similar list, order it similarly, and pick the first choice – the most politically correct choice – because it's what makes them feel good, obedient, and it's right according to them. They may have been trained to think that it's ethical if not morally superior to believe the first explanation according to a similar ordering.

And then there is the rest of us, the rational people who realize that the most politically correct explanation is often false and one should treat the explanations fairly and impartially, regardless of whether they sound politically correct or convenient for certain people etc.

In the absence of special data and circumstances, the rational people among us also favor the "least conspirational" explanation – well, the most likely one. However, it isn't necessarily the "most politically correct" one in general. Also, the fact that we try to favor the "most likely" explanation is a tautology – it's the task we are solving from the beginning.

But in this case, and many others, there are lots of special facts that seem to matter and affect the probabilities. In this case, and quite generally, they just make the "conspiracy-like explanations" more likely. (A much more detailed analysis should be written to clarify which hypotheses are strengthened by which special circumstances.) In this Epstein story, they are e.g. the following:
  1. Epstein was on suicide watch just three weeks ago but he was taken from the suicide watch days before he was found dead
  2. Epstein has previously claimed that someone tried to kill him in jail
  3. the cameras that could watch him were looking the other way for a very long time – a fact that may clearly be counted as a case of malfunctioning camera (and Polymath is just batšit crazy when he claims that a camera looking the other way, away from Epstein, for hours (?) is not malfunctioning)
  4. Epstein's cellmate was transferred hours before Epstein's death (a possible witness)
  5. the cellmate was taken out from a cell that has a bunk bed (double decker) which is probably needed for a suicide claim (but the very presence of a bunk bed increases the probability of the suicide option 1, too)
  6. he should have been checked every 30 minutes but around the death, the protocol was violated for hours
  7. one of the two relevant guards wasn't a corrections officer but a more unrelated employee
  8. he was claimed to hang himself using bed sheets but the sheets should have been made of paper and the bed frame was unmovable while the room was 8-9+ feet high
  9. a new huge batch of documents about the ring was released by court a day before his death
  10. the number of people who had the motive to kill Epstein was huge – and their combined power is even greater because they were usually rich and high-profile people (note that I don't make any claim about whether the potential killer was left-wing or right-wing – people in both camps speculate but the left-wing killers seem more likely because they were more connected with Epstein and apparently more sinful)
And I am pretty much certain that this list is incomplete, even when it comes to coincidences that have really shocked me. I tried to add some hyperlinks (sources) to the list above but there's no objective way to determine what is the "best" source. Most of these things simply look credible. Some of them really look implicitly "proven". If there were a good camera recording of his suicide, we would have probably learned about it, right?

So I think it's just OK to list similar coincidences even without other "sources". In my case, they are the result of my research and careful curation of sources. I am proud of offering occasional investigative stories that are both more accurate and earlier than elsewhere. So if someone suggests that I should be just a follower who copies some MSNBC articles, I feel incredibly insulted, because TRF is obviously better, more accurate, and more groundbreaking than MSNBC. If you really disagree with such a claim, then it would be sensible for you to avoid my website altogether, wouldn't it?

At any rate, there is a very large number of "coincidences" that are generally increasing the probability of the more "conspiracy-like" explanations. Everyone who doesn't acknowledge this fact is a brainwashed or brain-dead irrational moron, a stupid sheep that might be used for wool but not for thinking. The event may still turn out to be a suicide and the coincidences may be just coincidences. But even if that is the case, it will still be true that the people who accept this conclusion immediately are either stupid or dishonest – or perhaps even involved in the plan.

A broken clock is correct twice a day. A wrong reasoning may sometimes end up with a conclusion that happens to be right, too. But even when it is so, we can still analyze how the reasoning was made and if it is demonstrably fallacious or stupid, it can be demonstrated that it is fallacious or stupid – and that the person reasoning in this way is analogous to the broken clock.

Now the analogy. You have the people who won't ever acknowledge any arguments involving fine-tuning or naturalness or the preference for theories that just look more solid, less contrived etc. Like in the Epstein case, these people find their winning explanation in advance, i.e. by ignoring all the relevant detailed evidence that may be collected later. And then they just rationalize this explanation and spit on the alternatives and everyone who "dares" to defend them.

So these people may decide that the best theory is a "quantum field theory with the smallest number of component fields" – their form of Occam's razor. Supergravity or string theory "add fields", according to their counting, so they are less compatible with this version of Occam's razor, and therefore they eliminate these theories even though they don't have any negative evidence.

But competent physicists don't think like that. The claim that a "field theory with the smallest number of fields is most likely" is just a hypothesis and there is an extremely strong body of evidence – both anecdotal empirical evidence and theoretical evidence in the form of incomplete but nearly mathematical proofs – that this assumption is incorrect. Competent physicists really know that the relevant realization of Occam's razor is different and when some multiplets (or supermultiplets) of fields are guaranteed to exist by a symmetry principle or another qualitative principle, they cannot be counted as a disadvantage of the theory that makes them unlikely, despite the fact that the number of component fields may grow very high.

So once again, competent physicists are actually doing something that is analogous to the rational people who care about the peculiarities involving Epstein's guards, documents, camera, and cell maintenance. They just work with the evidence in a nontrivial way – with lots of evidence. The rational usage changes the odds of various theories and even classes of theories. In particular, people have learned that theories with greater numbers of component fields implied by powerful enough symmetry principles (or similar principles) seem like the more natural, default, apparently more likely hypothesis than the naive theory with the smallest number of component fields.

Both in the case of particle physics and Epstein's death, there simply exist two groups of people. One prefers an impartial treatment of the hypotheses and relentless, rigorous work with the detailed evidence and its ramifications; the other prefers naively simple explanations picked by some stupid – and, in general, clearly incorrect – criteria, followed by repetitive rationalization and frantic but content-free attacks against everyone who disagrees.

by Luboš Motl (noreply@blogger.com) at August 14, 2019 03:55 AM

August 13, 2019

Emily Lakdawalla - The Planetary Society Blog

Chandrayaan-2 Headed for Lunar Orbit
At 02:21 IST (20:51 UTC), the spacecraft fired its main engine, changing its orbit to intersect with the Moon.

August 13, 2019 10:37 PM

Marco Frasca - The Gauge Connection

Where are we now?

Summer conferences have passed, we have more precise data on the Higgs particle, and some new results were announced. So far, this particle appears more and more in agreement with the Standard Model expectations, with no surprise in view. Several measurements were performed with the full dataset at 140 {\rm fb}^{-1}. Most commentators avoid mentioning this because it no longer warrants click-bait. At EPS-HEP 2019 in Ghent (Belgium), the following slide was presented by Hulin Wang on behalf of the ATLAS Collaboration

ZZ decay and higher resonances

There appears to be an excess at 250 GeV and another at 700 GeV, but we are talking about 2 sigma, nothing relevant. Besides, ATLAS keeps seeing an excess in vector boson fusion for the ZZ decay, again about 2 sigma, but CMS sees nothing; rather, they are somewhat on the deficit side!

No evidence of supersymmetry whatsoever: neither a Higgs multiplet nor a charged Higgs, which could hint at supersymmetry, has been seen. I would like to recall that some researchers were able to obtain the minimal supersymmetric standard model from string theory, so this is a decisive aspect of the experimental search. Is the Higgs particle just the first of an extended sector of electroweak (soft) supersymmetry breaking?

So, why could the slide I just posted be so important? The interesting fact is the factor of 2 between the mass of this presumed new resonance and that of the Higgs particle. The Higgs sector of the Standard Model can be removed from it and treated independently. Then one can solve it exactly, and the spectrum is given by integer multiples of the mass of the Higgs particle. This is exactly the spectrum of a Kaluza-Klein particle, and it would represent an indirect proof of the existence of another dimension in space. So, if confirmed, we would move from a desolate scenario with no new (beyond the Standard Model) physics in view to a completely overturned situation! We could send all the critics back to sleep, wishing them better luck for their next attempt.
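Numerically, the integer-multiple spectrum described above is trivial to write down (taking m_H ≈ 125 GeV; the value and the tower depth here are just for illustration):

```python
HIGGS_MASS = 125.0  # GeV, approximate

def kk_tower(m0, n_max):
    """Spectrum m_n = n * m0 for n = 1..n_max, the integer-multiple tower described above."""
    return [n * m0 for n in range(1, n_max + 1)]

tower = kk_tower(HIGGS_MASS, 6)
print(tower)  # [125.0, 250.0, 375.0, 500.0, 625.0, 750.0]
```

On this counting, the ~250 GeV excess would sit exactly at the n = 2 level, while a ~700 GeV excess would fall between the n = 5 and n = 6 levels.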

Back to reality: the slide shows the result for the 36.1 {\rm fb}^{-1} dataset, and no confirmation from CMS has ever arrived. We can only hope that the dream scenario comes to life.

by mfrasca at August 13, 2019 05:59 PM

CERN Bulletin

The Staff Association and you

The Association, your representative to the Management and the Member States

Article VII 1.01 of the Staff Rules and Regulations (SR&R) provides that "Independently of the hierarchical channels, the relations between the Director General and the personnel shall be established either on an individual basis or on a collective basis with the Staff Association as intermediary".

This essential role of the Association through the Staff Representatives, to be the spokespersons of all the Organization's personnel to the Director General and the Member States, is reflected in active participation in the various joint committees defined in the Staff Regulations but also in all the working groups and official bodies dealing with staff matters.

The most important joint committee is the Standing Concertation Committee (SCC), which has been the forum for Concertation since 1983, when it replaced the Standing Consultation Committee.

Since then, the Concertation process has been enshrined in the Staff Rules and Regulations, which define its scope of application: “Any proposed measures of a general nature regarding the conditions of employment or association of members of the personnel shall be the subject of discussion within the SCC” (S VII 1.08). More generally, all questions relating to the employment and working conditions of the members of personnel are discussed in the SCC, including, in particular, issues of remuneration, social protection (CHIS and the Pension Fund), and career evolution.

What is meant by Concertation? Article S VII 1.07 of the Staff Regulations and Rules defines: “Discussion shall mean a procedure whereby the Director-General and the Staff Association concert together to try to reach a common position.”

The TREF, in its role as a tripartite forum, is the only place where your representatives meet with the delegates of the Member States, in the presence of the Management, to explain and defend the position of the personnel on all matters relating to employment conditions. Finally, the Finance Committee also gives the Association a voice in decisions concerning the personnel.

The Association, a force of persuasion... thanks to you

In order to properly fulfil its role as a representative, as mentioned above, and to be persuasive in front of its interlocutors, the Association needs not only broad representation but also broad support from the personnel, i.e. from all of you!

Join the Staff Association, but also come and learn about our public meetings, discuss your concerns with your Staff delegates from your department. Express your point of view by responding to our questionnaire and participate in the events we organise when we need to show your massive support to our decision-makers. All together and united, we will be stronger to ensure that CERN remains not only a laboratory of scientific excellence as it has been for more than 60 years, but also an example as an employer at the forefront of progress in the social field. In order to achieve these objectives, the Organization must be able to continue to attract, motivate and retain the best specialists from all Member States.

Become a Staff delegate, and be part of a collective, exchange your ideas, participate in the development of proposals, be involved in your employment conditions!

You are the Association

In order to prepare as well as possible for the next 5-Yearly-Review, whose subjects will be decided by the CERN Council next year (2020), the Staff Association will, in the coming weeks, solicit all CERN employed members of personnel – staff members and fellows – through a survey that will allow you to tell us what you think of our current employment conditions and how you would like to see them evolve.

Your massive participation is therefore essential. It will also allow us to prioritize more precisely the themes to be addressed when the first discussions with the Management begin next year.

In advance, we thank you for your participation and for the time you will devote to completing this questionnaire.

We need YOU to build YOUR employment conditions for CERN of tomorrow.

A survey in collaboration with the University of Geneva

In order to conduct this survey, the Staff Association has enlisted the support of the Institute of Demography and Socioeconomics of the University of Geneva.

The questionnaire will be hosted on their platform, and only the results of the analyses will be communicated to the Staff Association, in statistical form, in order to guarantee the total anonymity of the participants.

Please do not be surprised to receive, in the coming weeks, an email from the University of Geneva inviting you to participate in a survey from the Staff Association!


Join us!

The coming years 2020 and 2021 will therefore be crucial. Come and support us in defending and developing our employment conditions in a positive way. Join the Association. Actively participate in the life of the Association by establishing regular contacts with your delegates and colleagues. Vote to elect your delegates and, why not, stand for election to the Staff Council in the elections this coming October and November.



From the start of 2020, the new Council that emerges from these elections will have the heavy task of defending the priorities of the personnel in the 5-Yearly-Review before the Management and the Member States at the TREF.

Then, at the end of 2020 and throughout 2021, our second mission will be to ensure that the proposals prepared by the Management in concertation with the Staff Association are properly implemented.

Become a Staff delegate; build your future at CERN!


August 13, 2019 05:08 PM

CERN Bulletin

Conference

The Staff Association is pleased to invite you to a conference:

August 13, 2019 03:08 PM

CERN Bulletin

CANOË KAYAK CERN

European Masters Games Torino 2019

From 26 July to 4 August 2019, the European Masters Games took place in Turin; nearly 35,000 athletes over the age of 35 made the trip to compete against each other in the 29 disciplines on offer.

Among them, canoeist Olivier Barriere took the start in the slalom events, held on Saturday 3 August at the Ivrea whitewater stadium, one of the most demanding courses on the European circuit.

At the end of this day of racing, the Canoë Kayak CERN competitor, who was racing under the colours of France, won medals in three categories in the 45-50 age group:

  • silver in the individual single canoe;
  • gold in the team single canoe with his partners Charly Chassigneux and Martin Cote; and finally
  • gold in the double canoe with his Oyonnax club teammate Davy Egraz.

Following this wonderful experience, he encourages you to take part in the upcoming Masters events that will be held around the world in the coming years, listed here:


August 13, 2019 03:08 PM

CERN Bulletin

INTERFON

Cooperative of international civil servants. Discover all of our benefits and discounts from our suppliers on our website www.interfon.fr or at our office in building 504 (open every day from 12:30 pm to 3:30 pm).

August 13, 2019 03:08 PM

CERN Bulletin

GAC-EPA

The GAC organizes drop-in sessions with individual interviews. The next session will be held on:

Tuesday 27 August, from 1:30 pm to 4:00 pm

Staff Association meeting room

The Pensioners' Group sessions are open to beneficiaries of the Pension Fund (including surviving spouses) and to all those approaching retirement.

We warmly invite the latter to join our group by obtaining the necessary documents from the Staff Association.

Information: http://gac-epa.org/

Contact form:

http://gac-epa.org/Organization/ContactForm/ContactForm-fr.php

August 13, 2019 03:08 PM

Emily Lakdawalla - The Planetary Society Blog

NASA, ESA Officials Outline Latest Mars Sample Return Plans
The current strategy includes the Mars 2020 rover, a lander carrying a rover and ascent vehicle, and an Earth return orbiter.

August 13, 2019 09:00 AM

August 12, 2019

Emily Lakdawalla - The Planetary Society Blog

Hayabusa2 Nailed its Second Touchdown on Asteroid Ryugu
JAXA's sample collection spacecraft touched down just 60 centimeters away from its aimpoint.

August 12, 2019 05:11 PM

August 11, 2019

Lubos Motl - string vacua and pheno

Four Tommaso Dorigo's SUGRA blunders
Almost all the media reported on the new Special Breakthrough Prize in Fundamental Physics (which will be given to the winners during a TV broadcast event on November 3rd in NASA's Hangar One, Mountain View, CA) – a prize to three founders of supergravity – as if it were any other prize.

The winners are lucky to divide the $3 million and/or they deserve the award, which was chosen by a nontrivial process, like in the case of the Nobel Prize or any other prize. Thankfully, in this case, most journalists didn't try to pretend that they know more about supergravity than the committee. Judgements about the importance of work in theoretical physics should be left to the experts, because these are damn hard things that an average person – and even an average PhD – simply hasn't mastered.

I detected three amazing exceptions. Nature, Prospect Magazine, and Physics World wrote something completely different. The relevant pages of these media have been hijacked by vitriolic, one-dimensional, repetitive, scientifically clueless, deceitful, and self-serving anti-science activists and they tried to sling as much mud on theoretical physics as possible – which seems to be the primary job description of many of these writers and the society seems to enthusiastically fund this harmful parasitism.



It could be surprising, especially in the case of Nature and Physics World, because under normal circumstances, you would expect Nature and Physics World to be more expert-oriented and closer to the "scientific establishment". But the evolution of the media has produced the opposite outcome. The media that should be close to the scientific establishment are actually almost completely controlled by the self-anointed Messiahs – another branch of all those SJWs who want to destroy the civilized world as we have known it for centuries.

It's ironic, but if you look at the reasons, it's logical. The reasons are analogous to the fact that "inner cities" typically become ghettos or homes to poor demographic groups – while the productive parts of society typically have to move to more generic and less "central" suburbs. Similarly, the richest Western European countries are those that seem most likely to lose their civilized status very soon. What is the reason? Well, the most special and prosperous places – the inner cities or the rich Western countries – are those that also maximally attract the people who are destined to ruin them.

That's why the "most pro-science journals", inner cities, and wealthiest Western countries putrefy well before others.



Sadly, experimental particle physicist and blogger Tommaso Dorigo has partly joined these anti-civilization warriors. He wrote
My Take On The Breakthrough Prizes
where he repeats several deep misconceptions of the scientifically illiterate public. First, he recommended the three winners a particular way to spend the money. But Tommaso is no longer capable of even doing jokes properly, so let me fix his failed attempt. He advised
  • Ferrara to buy a new Ferrari
  • van Nieuwenhuizen to buy a newer housing in Malibu
  • and a new van for Dan Freedman's bikes so that he may become a truly freed man
OK, Dorigo failed at humor as well – now the more serious things. Dorigo says that it's good news that a rich guy named Milner has randomly decided to pay money for a failed theoretical idea named supergravity. Such a statement is wrong at every discernible level.

First, Dorigo completely misunderstood who picks the winners.

Future winners of the Breakthrough Prize in Fundamental Physics must first be nominated. I know everything about the process of nomination – because I am a nominator. But more importantly, Dorigo failed to read even the most elementary press release. If he had read it, he would know that
A Special Breakthrough Prize in Fundamental Physics can be awarded by the Selection Committee at any time, and in addition to the regular Breakthrough Prize awarded through the ordinary annual nomination process. Unlike the annual Breakthrough Prize in Fundamental Physics, the Special Prize is not limited to recent discoveries.
The quote above says that it is the Selection Committee that decides to grant this special prize – and it can do so at any moment. Is the committee composed of Milner? Or Milner and Zuckerberg? Not at all. Just do a simple Google search and you will find the composition of the Selection Committee. You will find out that the committee consists of the winners of the full-sized Breakthrough Prize in Fundamental Physics – the page contains names of 28 men alphabetically sorted from Arkani-Hamed to Witten (the list of men is surely open to hypothetical women as well). There is no Milner or Zuckerberg on the committee.

(After the SUGRA update, the list will include 4 former co-authors of mine. So I should also win the prize by default, without the needless bureaucracy.)

So you can see, the collection of the winners so far does exactly the same thing during their meetings as members of the Arista that Feynman was once admitted to: to choose who else is worthy to join the wonderful club of ours! ;-) Feynman didn't like it – because he didn't like any honors or the related pride about the status – but if you look at it rationally, you will agree that it's the "least bad" way of choosing new winners.

I find it puzzling that despite Dorigo's (and similar people's) obsession with the money, awards, and all the sociological garbage, he was incapable of figuring out whether the new winners are picked by Milner or by top physicists. It's the latter, Tommaso. You got another failing grade.

The main failing grade is given for the ludicrous comments about the "failed supergravity", however.

Well, to be sure that his dumb readers won't miss it, he wrote that supergravity was a "failed theory" not once but thrice:
I'll admit, I wanted to rather title this post "Billionaire Awards Prizes To Failed Theories", just for the sake of being flippant. [...]

It is a sad story that SUGRA never got a confirmation by experiment to this day, so that it remains a brilliant, failed idea. [...]

(SUGRA is, to this day, only a beautiful, failed theory)
Sorry, Tommaso, but just like numerous generic crackpots who tightly fill assorted cesspools on the Internet, you completely misunderstand how the scientific method works. A theory cannot become "failed" for its not having received an experimental proof yet.

On the contrary, the decisions about the validity of scientific theories are all about the falsification. For a scientific theory or hypothesis to become failed, one has to falsify it – i.e. prove that it is wrong. The absence of a proof in one way or another isn't enough to settle the status of a theory.

Instead, a theory or hypothesis must be in principle falsifiable – which SUGRA is – and once it's discovered, defined, or formulated, it becomes provisionally viable or temporarily valid up to the moment when it's falsified. And that's exactly the current status of SUGRA: it is provisionally viable or temporarily valid.

A physicist must decide whether the Einsteinian general relativity with or without the local supersymmetry – GR or SUGRA – seems like the more likely long-distance limit of the effective field theories describing Nature (in both cases, GR or SUGRA must be coupled to extra matter). But the actual experts who study these matters simply find SUGRA to be more likely for advanced reasons (realistic string vacua seem to need SUSY, naturalness, and others) – so SUGRA is the default expectation that will be considered provisionally valid up to the moment when it's ruled out.

In a typical case of falsification, an old theory is ruled out simultaneously with some positive evidence supporting an alternative, usually newer, theory.

But even if you adopted some perspective or counting in which SUGRA is not the default expectation about the relevant gravitational local symmetries in Nature, supergravity is still found in 176,000 papers according to Google Scholar. It's clearly a theory that has greatly influenced physics, according to the physicists. Of course, sane science prizes should exhibit some positive correlation with the expert literature. A layman may claim to know more than the theoretical physicists, but that's unwise.

Everyone who writes that SUGRA is a "failed idea" is just a scientifically illiterate populist writer who clearly has nothing to do with the good science of the 21st century – and whose behavior is partly driven by the certainty that he or she could never be considered as a possible winner of an award that isn't completely rigged. Sadly, Tommaso Dorigo belongs to this set. He may misunderstand why good physicists consider SUGRA to be the "default expectation" – that would be mere ignorance, an innocent reason why Dorigo has no chance to make it to a list that stretches from Arkani-Hamed to Witten.

However, he is a pompous fool because he also brags about this ignorance. He boasts how wonderfully perfumed the cesspool where he belongs is.

Egalitarianism

If you're not following the failing grades, Dorigo has gotten three of them so far: for the inability to convey good jokes when he tries; for the misunderstanding of the decisions that pick the new winners; and for misunderstanding what it takes for a theory to become "failed" in science. He deserves a fourth failing grade for the comments at the end of his text. He tried to emulate my "memos" but his actual memo – in an article about supergravity! – is that inequality in the world is the principal cancer that must be cured.

Holy cow. First of all, such totally ideological comments are out of place in an article pretending to be about supergravity – but if he deserved a passing grade, he would have written that the real cancer is egalitarianism, Marxism, and especially its currently active mutation, neo-Marxism. This is the disease of mankind that all decent people are trying to cure right now!

by Luboš Motl (noreply@blogger.com) at August 11, 2019 12:34 PM

Jon Butterworth - Life and Physics

Space Shed at Latitude
I did an interview with Jon Spooner, Director of Human Space Flight at the Unlimited Space Agency at Latitude 2018. It is now available as a podcast, which you can listen to here (Series 1, Episode 3). It is intended to … Continue reading

by Jon Butterworth at August 11, 2019 07:25 AM

August 10, 2019

Jon Butterworth - Life and Physics

“The Land of Crap Rasputins”
I have been feeling a bit adrift since the grave mistake of 2016, but I think I might be able to cope with being a citizen of “the land of crap Rasputins”. So thank you Marina Hyde. And if the next … Continue reading

by Jon Butterworth at August 10, 2019 09:19 AM

August 09, 2019

John Baez - Azimuth

2020 Category Theory Conferences

 

Yes, my last post was about ACT2019, but we’re already planning next year’s applied category theory conference and school! I’m happy to say that Brendan Fong and David Spivak have volunteered to run it at MIT on these dates:

• Applied Category Theory School: June 29–July 3, 2020.
• Applied Category Theory Conference: July 6–10, 2020.

The precise dates for the other big category theory conference, CT2020, have not yet been decided. However, it will take place in Genoa sometime in the interval June 18–28, 2020.

There may also be an additional applied category theory school in Marrakesh from May 25–29, 2020. More on that later, with any luck!

And don’t forget to submit your abstracts for the November 2019 applied category theory special session at U. C. Riverside by September 3rd! We’ve got a great lineup of speakers, but anyone who wants to give a talk—including the invited speakers—needs to submit an abstract to the AMS website by September 3rd. The AMS has no mercy about this.

by John Baez at August 09, 2019 06:12 AM

August 08, 2019

ZapperZ - Physics and Physicists

RIP J. Robert Schrieffer
I'm sad to hear of the passing of a giant in our field, and certainly in the field of Condensed Matter Physics. Nobel Laureate J. Robert Schrieffer has passed away at the age of 88. He is the "S" in the BCS theory of superconductivity, one of the most monumental theories of the last century, and one of the most cited. So "complete" was the theory that, by early 1986, many people thought that the field of superconductivity had been fully "solved" and that nothing new could come out of it. Of course, that changed completely after that.

Unfortunately, I wasn't aware of his predicament during the last years of Schrieffer's life. I certainly was not aware that he was incarcerated for a while.

Late in life, Dr. Schrieffer’s love of fast cars ended in tragedy. In September 2004, he was driving from San Francisco to Santa Barbara, Calif., when his car, traveling at more than 100 miles per hour, slammed into a van, killing a man and injuring seven other people.

Dr. Schrieffer, whose Florida driver’s license was suspended, pleaded no contest to felony vehicular manslaughter and apologized to the victims and their families. He was sentenced to two years in prison and released after serving one year.

Florida State placed Dr. Schrieffer on leave after the incident, and he retired in 2006.

I've met him only once while I was a graduate student, and he was already at Florida State/NHML at that time. His book and Michael Tinkham's were the two that I used when I decided to go into superconductivity.

Leon Cooper is the only surviving member of the BCS trio.

Zz.

by ZapperZ (noreply@blogger.com) at August 08, 2019 07:59 PM

Jon Butterworth - Life and Physics

Current World View
Situated at one end of "The Long Walk", with Windsor Castle at the other, the view of the Copper Horse is awesome. Some people even climb the plinth to get a better view… Some people…

by Jon Butterworth at August 08, 2019 11:14 AM

August 07, 2019

Lubos Motl - string vacua and pheno

Andy Strominger becomes lead cheerleader at Greene's festival
Carlo Rubbia demands that particle physicists be courageous and build the damn muon collider, a compact Higgs factory.
A few days ago, the World Science Festival of Brian Greene posted a 90-minute video with interviews about the state of fundamental physics:



Bill Zajc sent it to me and I finally had the time and energy to watch it, at double speed. At the beginning, four minutes of Greene and visual tricks – similar to those from his PBS TV shows – are shown. I actually think that some of the tricks are new and even cooler than they used to be. I really liked the segment where Greene was grabbing and magnifying the molecules and taking the atoms, nuclei, and strings out of them. The illustrations of the microscopic building blocks had to be created to match the motion of Greene's hands.



Greene had three guests whom he interviewed separately, in 30-minute segments: Marcelo Gleiser who spoke like a historian and philosopher; Michael Dine who covered the search for supersymmetry etc.; and Andrew Strominger who amusingly discussed the exciting quantum gravity twists of the reductionist stringy program.



I know Dine and Strominger very well – not only as co-authors of papers. I have never met Gleiser. Some Brazilian commenters mention he is unknown in Brazil but he should be famous, partly because he is a DILF – I suppose it meant a Daddy [He or She] would Like to Fudge. Gleiser and Greene discuss the history of unification and a model of expanding knowledge (an island) which also expands the ignorance (the boundary between the known island and the unknown sea). I thought that David Gross insisted that it was his metaphor but OK.

Michael Dine has been a phenomenologist but he bragged that he was one of the few who were cautiously saying that SUSY might remain undiscovered after a few years ;-). I don't really think that this skepticism was rare (my 2007 predictions said that the probability of an LHC SUSY discovery was 50% and I think that agreed with what people were saying).

The real issue is that when you're skeptical about discoveries around the corner, you don't write papers about it – because you really don't think that you have anything interesting to write about. That's why the literature on this topic is naturally dominated by the people who did expect speedy discoveries of SUSY. But the actual percentage of HEP people who expected speedy discoveries of BSM physics wasn't much higher than 1/2, and maybe it was lower than 1/2. The skeptics avoided writing useless and unmotivated papers, and I think that's the right thing to do. One must just be careful not to misinterpret the composition of the literature as a quantification of some collective belief – and if that invalid interpretation is made, it's the interpreter's fault.

OK, so Greene said that 90% of Dine's work may be wrong or useless and Dine roughly agreed.

Andy Strominger (whose 94-year-old father just got a new 5-year early career grant) pumped quite a different level of excitement into the festival. The research turned out to be much more exciting than what people expected. People expected some reductionist walk towards shorter distances and higher energies, they would finally find the final stringy constituents, strings, and that would be the end of it. Physics departments would shut their doors and Brian Greene would pour champagne on his festivals, or something like that, Andy said. ;-)

What happened was something else. People found the black holes, their puzzling quantum gravity behavior, holography, connections with superconductors and lots of other things. This is better than just "some final particles inside quarks". I agree with that. Strominger's metaphor involves Columbus. You can see that Andy is not quite the extreme progressive left – because those wouldn't dare to speak about the imperialist villain Columbus without trashing him.

OK, Columbus promised to get to China. Instead, he discovered America. Some of his sponsors were disappointed that he only discovered some America and not China. These days, people may be happy because America is better than China. Andy forgot to describe the future evaluation. Maybe, in a century or so from now, people will be disappointed again that Columbus found just America and not China because China may be better. But it could be good news for Andy (he's at least impartial) – who has spent quite some time in China and who actually speaks Chinese.

While Dine predicted that the field would shrink, Strominger's description implicitly says that the field is growing. I agree with Andy – but a part of the difference is that Andy and Michael don't have quite the same field. Dine is a phenomenologist and the contact with the experiment is almost a defining condition of his field or subfield. Strominger is a formal theorist so he can do great advances independently of experimental tests.

It would surprise the people of 1985 to see how string theory research has evolved. In 1985, Witten expected that in a few weeks, the right compactification would be found and that would be the end of physics. Instead, all the complex branches of knowledge were found. With hindsight, it's almost obvious that what has happened had to happen. The one-direction reductionist approach works well but only up to the Planck scale.

There are no sub-Planckian, physically distinguishable distances. So it should have been clear, even in advance, that the one-direction journey had to become impossible or multi-directional once the true phenomena near the fundamental scale are probed. And that's what happened. Most of the stringy quantum gravity research makes it very clear that it's not a part of quantum field theory that respects the one-directional quantum field theory renormalization flows. All phenomena are "equally fundamental" at the Planck scale, there is no clear explanatory direction over there anymore. People unavoidably study effects at distances that no longer shrink (you shouldn't shrink beneath the Planck length) and they study increasingly complex behavior (starting with the entanglement) of the building blocks that are roughly Planckian in size.

A viewer who has observed the exchange carefully could have seen some additional, funny subtle differences between Greene and Strominger. Greene wanted to introduce black hole thermodynamics with stories worshiping John Wheeler. It was all about Wheeler and... about his student Jacob Bekenstein. Meanwhile, Andy Strominger responded by completely ignoring this thread and talking about his late friend Hawking only. Strominger told us that Hawking had a tomb with the Bekenstein-Hawking entropy on it.



Well, Andy has conflated Boltzmann's and Hawking's tombs. Boltzmann has \(S=k\cdot \log W\) over there while Hawking has the temperature formula\[

T = \frac{\hbar c^3}{8\pi G M k}

\] on his grave, along with "here lies what was mortal of Stephen Hawking, 1942-2018", indicating that most of Stephen Hawking is immortal and eternal. I just wanted to return some rigor to the discussion of graves here, Andy.
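Incidentally, the gravestone formula is easy to evaluate. A minimal Python sketch (my own illustration; the constants are standard SI values and the solar mass \(1.989\times 10^{30}\,{\rm kg}\) is a round figure assumed here) shows why nobody will measure the Hawking temperature of an astrophysical black hole anytime soon:

```python
import math

# Physical constants in SI units (CODATA values)
hbar = 1.054571817e-34  # reduced Planck constant [J*s]
c = 2.99792458e8        # speed of light [m/s]
G = 6.67430e-11         # Newton's constant [m^3/(kg*s^2)]
k = 1.380649e-23        # Boltzmann constant [J/K]

def hawking_temperature(M):
    """Hawking temperature T = hbar c^3 / (8 pi G M k) of a black hole of mass M [kg]."""
    return hbar * c**3 / (8 * math.pi * G * M * k)

M_sun = 1.989e30  # solar mass [kg], assumed round value
print(f"T = {hawking_temperature(M_sun):.2e} K")  # about 6.2e-8 K
```

For a solar-mass black hole this gives roughly \(6\times 10^{-8}\,{\rm K}\), far below the temperature of the cosmic microwave background, so such a hole absorbs far more radiation than it emits; note also that the temperature is inversely proportional to the mass.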

All the guests turned out to be amusing narrators but it's pretty funny that Andy Strominger ended up as the most bullish and enthusiastic guest. Why? Fifteen years ago, Andy was invited to record some monologues for the PBS NOVA TV shows of Brian Greene, much like Cumrun Vafa and others. Vafa was totally excited about the TV tricks. They made his head explode! He watched it with his two sons, and when Vafa's and Bugs Bunny's heads exploded, all three boys were happy.

(Vafa has 2 sons, Strominger has 4 daughters. If this perfect correlation were the rule in Israel and Iran, you could be optimistic about the fudging happy end of the conflict of the two countries LOL.)

On the other hand, those 15 years ago, Andy Strominger was less enthusiastic because all his clips were removed from the show. Maverick Strominger's comments were insufficiently enthusiastic for Greene's simple, pre-determined, bullish tone. So now, when maverick Strominger is the cheerleader-in-chief, you may be pretty sure that most people speak rather negatively and pump lots of disillusion into the discourse.

by Luboš Motl (noreply@blogger.com) at August 07, 2019 06:47 PM

Axel Maas - Looking Inside the Standard Model

Making connections
Over time, it has happened that a solution in one area of physics could also be used in a quite different area, or at least inspired a solution there. Unfortunately, this does not always work. Quite often, it turned out on reaching the finer points that something promising did not work in the end. Thus, it pays to be careful with such transfers, and never to believe a hype. Still, in some cases it worked, and even led to brilliant triumphs. And so it is always worthwhile to try.

Such an attempt is precisely the content of my latest paper. In it, I try to transfer ideas from my research on electroweak physics and the Brout-Englert-Higgs effect to quantum gravity. Quantum gravity is first and foremost still an unsolved issue. We know that mathematical consistency demands some unification of quantum physics and gravity, and we expect that this will come in the form of a quantum theory of gravity, though we still lack any experimental evidence for this assumption. Still, I too make the assumption, for now, that quantum gravity exists.

Based on this assumption, I take a candidate for such a quantum gravity theory and pose the question of what its observable consequences are. This is a question which has driven me for a long time in particle physics. I think that by now I have an understanding of how it works. But last year, I was challenged on whether these ideas can still be right if there is gravity in the game. And this new paper (https://arxiv.org/abs/1908.02140) is essentially my first step towards an answer. Much of this answer is still rough, and especially the mathematics will require much work. But at least it provides a first consistent picture. And, as advertised above, it draws from a different field.

The starting point is that the simplest version of quantum gravity currently considered is actually not that different from other theories in particle physics. It is a so-called gauge theory. As such, many of its fundamental objects, like the structure of space and time, are not really observable – just like most of the elementary particles of the standard model, which is also a gauge theory. Thus, we cannot see them directly in an experiment. In the standard model's case, it was possible to construct observable particles by combining the elementary ones. In a sense, the particles we observe are bound states of the elementary particles. However, in electroweak physics one of the elementary particles in the bound state totally dominates the rest, so the whole object looks very similar to the elementary one, but not quite.

This works because the Brout-Englert-Higgs effect makes it possible. The reason is that there is a dominating kind of unobservable structure, the so-called Higgs condensate, which creates this effect. This is somewhat coincidental: if the parameters of the standard model were different, it would not work. But, luckily, our standard model has just the right parameter values.

Now, when looking at gravity around us, there is a very similar feature. While we have the powerful theory of general relativity, which describes how matter warps space, we rarely see this warping. Most of our universe behaves much more simply, because there is so little matter in it, and because the parameters of gravity are such that this warping is very, very small. Thus, we again have a dominating structure: a vacuum which is almost not warped.

Using this analogy and the properties of gauge theories, I figured out the following: we can use something like the Brout-Englert-Higgs effect in quantum gravity. All observable particles must still be some kind of bound state, but they may now also include gravitons, the elementary particles of quantum gravity. Just like in the standard model, these bound states are dominated by one of their components, and if there is a standard model component, it is that one. Hence, the particles we see at the LHC will essentially look like there is no gravity, which is very consistent with experiment. Detecting the deviations will be so hard, compared to detecting those coming from the standard model, that we can pretty much forget about it for earthbound experiments – at least for the next couple of decades.

However, there are now also some combinations of gravitons without standard model particles involved. Such objects have long been speculated about, and are called geons, or gravity balls. In contrast to the standard model case, they are not classically stable, but they may be stabilized by quantum effects. The bound-state structure strongly suggests that there is at least one stable one. Still, this is pure speculation at the moment. But if such objects exist, they could have dramatic consequences. E.g., they could be part of the dark matter we are searching for. Or they could make up black holes, very much like neutrons make up a neutron star. I have no idea whether any of these speculations could be true. But if there is only a tiny amount of truth in them, this could be spectacular.

Thus, some master students and I will set out to have a look at these ideas. To this end, we will need to do some hard calculations. And, eventually, the results should be tested against observation. These will be coming from the universe, and from astronomy – especially from the astronomy of black holes, where there have recently been many interesting and exciting developments, like observing two black holes merge, or the first direct image of a black hole (obviously just black inside a kind of halo). These are exciting times, and I am looking forward to seeing whether any of these ideas work out. Stay tuned!

by Axel Maas (noreply@blogger.com) at August 07, 2019 08:37 AM

August 06, 2019

Lubos Motl - string vacua and pheno

Milner millions for SUGRA
This award is unlikely to erase our sorrow about the two weekend accidents in the mountains (see two previous blog posts) but... As The Symmetry Magazine and many others tell us, three men are going to share a special $3 million Breakthrough Prize in Fundamental Physics for the development of supergravity (or SUGRA for short).



I almost forgot that we have also been to Amsterdam in July 2002. Dan, Martijn, Hong

The three men who deserve the award for their advances especially in 1976 are Sergio Ferrara, Daniel Z. Freedman, and Peter van Nieuwenhuizen. Needless to say, while I could post pictures of all of them (Ferrara at Harvard and van Nieuwenhuizen in Stony Brook), I have been much closer to Dan Freedman (MIT), as a co-author of our pp-wave Paper of Seven that is still anxiously waiting for its 250th citation, and as a partner on bicycles who actually has strong leg muscles to be able to disappear from my horizon whenever he wants.

Congratulations, Dan! And others.



It's very natural for supergravity to finally be rewarded – and I admit that I have sometimes overlooked that this was the natural bunch of people who were still waiting for this prize.



Supergravity may be said to stand for "supersymmetric gravity" – and it's a combination or intersection of the principles of Einstein's general theory of relativity; and supersymmetry. Because both general relativity and supersymmetry are beautiful, supergravity is clearly super-beautiful.

Supersymmetry emerged as a clever loophole – a symmetry that is neither acting on the spacetime in the usual ways (like the Poincaré symmetry), nor is it an internal symmetry that preserves spacetime points and acts "inside each point" separately. Supersymmetry is something in between – which is only mathematically allowed because its generators are fermionic operators. Their anticommutator includes the momentum, the generator of spacetime translations:\[

\{Q_\alpha,Q_\beta\} = \Gamma^\mu_{\alpha\beta} P_\mu + \dots

\] Well, we usually need to distinguish dotted and undotted (chiral) spinor indices, add some gamma matrix or complex conjugation, and the dots may include "central charges". Supersymmetry was independently found both in the West and in the East – as separated by the Iron Curtain. In Russia, it was really found from algebraic considerations like the ones above. In the West, Pierre Ramond was incorporating fermions on the world sheet of string theory and found the world sheet supersymmetry – by seeing that it's an algebra obeyed by a particular 2D theory with bosons and fermions. We are in the early 1970s now.
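To make the schematic anticommutator above concrete: in the standard four-dimensional \(\mathcal{N}=1\) notation with two-component Weyl spinors (a textbook form, written here only for comparison), the nontrivial anticommutator relates an undotted and a dotted spinor charge,\[

\{Q_\alpha, \bar Q_{\dot\beta}\} = 2\sigma^\mu_{\alpha\dot\beta} P_\mu, \qquad \{Q_\alpha, Q_\beta\} = 0,

\] while in extended supersymmetry the second anticommutator may pick up the central charges hidden in the dots of the schematic formula.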

Around 1974, Wess and Zumino would construct the first interacting four-dimensional supersymmetric theory – a scalar superfield with a cubic superpotential. This kind of work was going to expand around 1980 when physicists constructed the Minimal Supersymmetric Standard Model. SUSY GUT (grand unified theory) was added later. All these theories were non-gravitational.

In 1976, the new winners of the award constructed the first usable supergravity formalism for \(D=4\) and higher. It was simply a classical field theory which had the metric tensor – but to make SUSY easier, the metric tensor is replaced by the vielbein \(e^\mu_a\) – and the spin-3/2, Rarita-Schwinger field \(\psi_{\mu,\alpha}\) that carries both a vector and a spinor index (the spin is \(1+1/2=3/2 = 2-1/2\)). That fermionic field is the superpartner of the metric. They were able to define the supersymmetric variations of these fields so that the equations of motion were covariant under these transformations. The variations and equations have many terms so the underlying beauty of the local supersymmetry may look quite messy, disproving the idea that everything that is beautiful must look like a super-short formula on the T-shirt.

This discovery was destined to be generalized in the coming years. Higher-dimensional supergravity theories were found, culminating with the "prettiest" and "highest-dimensional" 11-dimensional supergravity (Cremmer-Julia-Scherk), and various other generalizations such as gauged supergravity theories were constructed. While supersymmetry was being added to the string theory world sheet already in the 1970s, starting with Ramond's work, Green and Schwarz only began to study a full-blown supersymmetric string theory or superstring theory in the early 1980s. When we say just "string theory" today, we almost always mean just the supersymmetric string theory that got mature in the early 1980s.

Only in 1995 did Witten figure out that the most beautiful supergravity theory in 11 dimensions was actually linked to string theory. Being UV-completed as M-theory, it is the strong coupling limit of the type IIA string theory and of the \(E_8\times E_8\) heterotic string theory (the latter duality was discovered by Hořava+Witten). So only in the 1990s did the supergravity and string theory "communities" really merge. The tight and important relationships between the theories in both "communities" became clear. Pretty much every string theorist understood that the SUGRA folks hadn't been completely wrong to study non-renormalizable theories, because many SUGRA theories emerge as the long-distance limit of string theories; and the SUGRA folks have generally also appreciated that the string theorists hadn't been doing a useless thing, because SUGRA is inconsistent by itself and needs a stringy completion at Planckian energies to restore consistency.

Just to be sure, this merger wasn't an example of a perfect melting pot. It resembled the reunification of Germany instead. People couldn't throw away the culture that had shaped them for decades. So e.g. Sergio Ferrara, Daniel Z. Freedman, and Peter van Nieuwenhuizen remained top "supergravity theorists" who wouldn't have too strong reasons (and powers) to be reshaped into string theorists.

SUGRA may be considered one of the main culminations – or the main culmination – of the research of (superficially) local quantum field theories with a "locally looking action" that are as beautiful as possible. The local diffeomorphism symmetry (Einstein's localization of the symmetry generated by the momentum \(P_\mu\)) is extended to include the local supersymmetry, too. Because the number of spinorial (fermionic) components grows exponentially with the dimension while the bosonic degrees of freedom grow polynomially, special dimensions are needed for SUSY to work – and 11 spacetime dimensions is the maximum in which the super-Poincaré symmetry may be defined. At the same time, SUGRA needs to be quantized for the fermions to have "beef" (classically, fermionic fields can only take the value zero), but the quantization makes it inconsistent at the loop level, so SUGRA is already a "transient" theory that makes the transition to "stringy" theories with extended objects unavoidable.
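The boson-fermion counting in the maximal case can be checked explicitly. As an illustration of my own (not part of the original text): the physical, on-shell degrees of freedom of 11-dimensional supergravity are classified by the little group \(SO(9)\) – the graviton is its symmetric traceless tensor, the three-form potential its antisymmetric 3-tensor, and the gravitino its vector-spinor with the gamma-trace removed. A few lines of Python confirm the match:

```python
from math import comb

D = 11
d = D - 2  # little group SO(D-2) = SO(9) acts on the physical polarizations

# Bosons: symmetric traceless tensor (graviton) + antisymmetric 3-form
graviton = d * (d + 1) // 2 - 1      # 44
three_form = comb(d, 3)              # 84

# Fermions: vector-spinor minus its gamma-trace (gravitino)
spinor = 2 ** (d // 2)               # SO(9) spinor: 16 real components
gravitino = d * spinor - spinor      # 128

print(graviton, three_form, gravitino)     # 44 84 128
print(graviton + three_form == gravitino)  # True
```

The equality \(44+84=128\) is exactly the boson-fermion matching that supersymmetry demands; because the spinor dimension doubles with every two extra dimensions while the tensor counts grow only polynomially, this is one heuristic way to see why eleven dimensions is maximal.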

Note that the formalism of SUGRA may become a really repulsive pile of technicalities with many terms. For example, the BRST symmetry generated by \(Q\) – with its cohomologies that many people already find too abstract – isn't "complex enough" to deal with SUGRA. Instead, because the constraints don't form an ordinary Lie algebra, to do the quantization properly, supergravity theories normally need a generalization of the BRST approach: the BV (Batalin-Vilkovisky) formalism.

Good job, Gentlemen.

Needless to say, we hear questions about whether SUGRA is relevant for the Nature around us. Ferrara says that the Higgs needed 60 years to be discovered – well, it was really less than 50, but OK – and that we will need a similar period of time to observe SUGRA. He hasn't indicated what his scenario for observing SUGRA is. I find such an observation rather unlikely – at most, we will see isolated particles that will be consistent with a "gravitino" etc. – but I am confident that SUGRA has to be an approximate description of a part of Nature, anyway. Among my "SUSY exists because" posts, the blog post "SUSY exists because the number 3/2 cannot be missing" is the most relevant one. At any rate, string/M-theory is unavoidable, only its supersymmetric versions are promising, and those have gravitinos and local supersymmetry.

We haven't seen any "direct empirical evidence" for SUGRA but the same comment applies to the Hawking radiation and other things. Their status – how much we know they are relevant in Nature – is comparable. Especially because we see that some of the confirmations may take a very long time, we must simply be capable of evaluating theories well before the confirmations arrive – otherwise science would change to empty promises of experimental discoveries. Instead, theoretical physics is a well-defined discipline with its theoretical – but indirectly empirically rooted – rules.

The three men started a transformation that has changed the default expectation about the relevance of local SUSY. I think that the contemporary competent physicists simply do believe that SUGRA is more likely to be relevant in Nature than not, and you would need another "opposite" discovery to change this default expectation. The change of the expectation is an important contribution to science. Science doesn't proceed just by breakthroughs that make everything certain (nothing is ever quite certain). Everyone who says otherwise is deeply misguided.



A five-minute-long guide on how to find gravitino balls – especially on high-security planets – and prove supergravity.

P.S.: Nature printed a story by an aggressive inkspiller named Zeeya Merali who had to call supergravity "speculative" in the title and place "supergravity" in quotation marks. Dear speculative "Zeeya", there is absolutely no justification for placing supergravity – a well-established concept in physics – in quotation marks and it's just plain idiotic to call the theory "speculative". (The quotes around "Zeeya" are fine because unlike supergravity, "Zeeya" isn't a valid English word.) It's just a damn theory – every theory in science is "speculative" to one extent or another. Also, to show that the number of redundant dumb attacks hasn't been enough to satisfy her, the subtitle repeats that SUGRA "might not be a good description of reality". Or, more importantly, it may well be one. Even if it were just a matter of reasonable probability that SUGRA is relevant in Nature, they would deserve the award. Even if it were just an amazing piece of mathematics, they would deserve a similar award.

by Luboš Motl (noreply@blogger.com) at August 06, 2019 06:51 PM

Jon Butterworth - Life and Physics

Future echoes
I have three email inboxes, for UCL, CERN and personal stuff. Catching up on some CERN bulk mails of the last few weeks, three interesting articles I thought worth sharing. One is a report from the “Future Circular Collider” (FCC) … Continue reading

by Jon Butterworth at August 06, 2019 03:41 PM

Lubos Motl - string vacua and pheno

Ann Nelson: 1958-2019
Bill Zajc has trained me to expect e-mails that I really enjoy reading – and it still worked very well an hour ago when I saw an e-mail about some new videos about string theory from Brian Greene's festival, including comments by Andy Strominger, Michael Dine, and others (80 minutes, "Loose Ends").

However, that enjoyment was crippled when I learned that:


Tragic.

Ann, an impressive peakbagger (see this list, including 264 peaks in her bag), was there with David B. Kaplan, her husband and also a particle physicist. I knew both – at least from a visit to the University of Washington, Seattle. (I only had hours to see the Needle Tower with the Flying Saucer at the top... and it was raining badly.) David B. Kaplan – not to be confused with David E. Kaplan – was also a co-author of our deconstruction paper.

During their long-planned backpack trip in the Central Cascades' Alpine Lakes Wilderness (Washington State, click for a map), after standing on loose rock, she fell down a gully on the traverse from the Necklace Valley to the West Fork Foss River Valley (pictures from the area; they were going from the red needle in the East to the West on this map).



Ann Nelson was a world class particle physicist – and the number of women who are or were this good or better in particle physics was at most a dozen.

One measure is to ask: "When was her or his work discussed on TRF most recently?" Ann Nelson's paper was discussed two weeks ago. I think it's a realistic and creative idea, like many papers before that. For example, she was a co-mother of the little Higgs. Ann+Nelson appears in 10 TRF texts.



Inspire credits her with 18,000 citations from 127 papers, quite a track record – and 12 papers are above 500 citations ("renowned"). Well, less happily, she was also an avid identity politics advocate (you may find a TRF entry about that, too; that orientation largely boils down to her being a "finding" of Howard Georgi's hunt for minority physicists in the 1980s) – and a keen yet fallible hiker.



I took this picture at Harvard, and I believe it was on February 16th, 2005 (although the visit I perceived more closely was the later one in January 2006)

Ann's most recent "her own" tweet (reply) were condolences to Catherine Freese who was devastated because her cat died. The weekend news looks more sorrowful.

RIP, Ann. Condolences, especially to David and their two children. Too bad, this is probably not the last tragic HEP news that you're gonna read today.

by Luboš Motl (noreply@blogger.com) at August 06, 2019 03:17 PM

ZapperZ - Physics and Physicists

Light Drags Electrons Backward?
As someone who was trained in condensed matter physics, and who has also worked on photoemission, light detectors, and photoelectron sources, I tend to follow research on light interaction with solids, and especially with metallic surfaces, rather closely.

I've been reading this article for the past few days and it gets more fascinating each time. This is a report on a very puzzling photon drag effect in metals, or in this case, in gold, which is the definitive Drude metal if there ever was one. What is puzzling is not the photon drag on the conduction electrons itself. What is puzzling is that the direction of the photon drag appears to be completely reversed between the effect seen in vacuum and in ambient air.

A review of the paper can be found here; if you don't have access to PRL, the arXiv version of the paper can be found here. It appears that, when done in vacuum, light pushes the conduction electrons backward, while when done in air, it pushes them forward, as expected.
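For a sense of scale, the naive light-pushing-electrons picture can be quantified in one line. This is a back-of-the-envelope sketch for orientation, not the analysis from the paper; the absorbed fraction A here is an illustrative parameter:

```latex
% Back-of-the-envelope photon drag: a photon of energy \hbar\omega
% carries momentum \hbar k = \hbar\omega / c. With incident intensity I
% and absorbed fraction A, the photon absorption rate per unit area is
% A I / (\hbar\omega), so the momentum delivered to the electrons per
% unit area and time (a radiation pressure) is
\[
  \frac{F}{\text{area}}
  \;=\; A\,\frac{I}{\hbar\omega}\cdot\frac{\hbar\omega}{c}
  \;=\; \frac{A\,I}{c},
\]
% directed along the propagation vector k. The surprise in vacuum is a
% voltage of the opposite sign, as if the current flowed against k.
```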

As they varied the angle, the team measured a voltage that largely agreed with theoretical expectations based on the simple light-pushing-electrons picture. However, the voltage they measured was the opposite of that expected, implying that the current flow was in the wrong direction. “It’s a weird effect,” says Strait. “It’s as if the electrons are somehow managing to flow backward when hit by the light.”
Certainly, surface effects may be at play here. And those of us who have done photoemission spectroscopy can tell you all about surface reconstruction, even in vacuum, when a freshly-cleaved surface literally changes characteristics right in front of your eyes as you continually perform a measurement on it. So I am not surprised by the differences detected between vacuum and in-air measurement.

But what is very puzzling is the dramatic difference here, and why light appears to push the conduction electrons one way in air, and in the opposite direction in vacuum. I fully expect more experiments on this, and certainly more theoretical models to explain this puzzling observation.

This is just one more example where, as we apply our knowledge at the edge of what we know, we start finding new mysteries to solve or to explain. Light interaction with matter is one of the most common and best-understood phenomena. Light interaction with metals is the basis of the photoelectric effect. Yet, as we push the boundaries of our knowledge, and start to look at very minute details due to applications in, say, photonics, we also start to see new things that we do not expect.

It is why I always laugh whenever someone thinks that there is an "end of physics". Even for the things that we think we know, or things that are very common, if we start to make better and more sensitive measurements, I don't doubt that we will start finding something else that we have not anticipated.

Zz.

by ZapperZ (noreply@blogger.com) at August 06, 2019 02:15 PM

Matt Strassler - Of Particular Significance

A Catastrophic Weekend for Theoretical High Energy Physics

It is beyond belief that not only am I again writing a post about the premature death of a colleague whom I have known for decades, but that I am doing it about two of them.

Over the past weekend, two of the world’s most influential and brilliant theoretical high-energy physicists — Steve Gubser of Princeton University and Ann Nelson of the University of Washington — fell to their deaths in separate mountain accidents, one in the Alps and one in the Cascades.

Theoretical high energy physics is a small community, and within the United States itself the community is tiny.  Ann and Steve were both justifiably famous and highly respected as exceptionally bright lights in their areas of research. Even for those who had not met them personally, this is a stunning and irreplaceable loss of talent and of knowledge.

But most of us did know them personally.  For me, and for others with a personal connection to them, the news is devastating and tragic. I encountered Steve when he was a student and I was a postdoc in the Princeton area, and later helped bring him into a social group where he met his future wife (a great scientist in her own right, and a friend of mine going back decades).  As for Ann, she was one of my teachers at Stanford in graduate school, then my senior colleague on four long scientific papers, and then my colleague (along with her husband David B. Kaplan) for five years at the University of Washington, where she had the office next to mine. I cannot express what a privilege it always was to work with her, learn from her, and laugh with her.

I don’t have the heart or energy right now to write more about this, but I will try to do so at a later time. Right now I join their spouses and families, and my colleagues, in mourning.

by Matt Strassler at August 06, 2019 12:35 PM

August 03, 2019

ZapperZ - Physics and Physicists

Einstein's Blunder Explained
This, actually, is a good and quick summary of the Einstein cosmological equation by Minute Physics. You'll get a brief history of the cosmological constant, and how it came back to life.



Zz.

by ZapperZ (noreply@blogger.com) at August 03, 2019 12:59 PM

July 29, 2019

Clifford V. Johnson - Asymptotia

News from the Front XIX: A-Masing de Sitter

[Diamond maser. Image from Jonathan Breeze, Imperial College.] This is part 2 of a chat about some recent thoughts and results I had about de Sitter black holes, reported in this arxiv preprint. Part 1 is here, so maybe best to read that first.

Now let us turn to de Sitter black holes. I mean here any black hole for which the asymptotic spacetime is de Sitter spacetime, which is to say it has positive cosmological constant. This is of course also interesting since one of the most natural (to some minds) possible explanations for the accelerating expansion of our universe is a cosmological constant, so maybe all black holes in our universe are de Sitter black holes in some sense. This is also interesting because you often read here about explorations of physics involving negative cosmological constant, so this is a big change!

One of the things people find puzzling about applying standard black hole thermodynamics here is that there are two places where the standard techniques tell you there should be a temperature: the black hole horizon itself, and the cosmological horizon. These each have a temperature, and they are not necessarily the same. For the Schwarzschild-de Sitter black hole, for example (so, no spins or charges... just a mass with a horizon associated with it, like in flat space), the black hole's temperature is always larger than that of the cosmological horizon. In fact, it runs from very large (when the black hole is small) all the way down (as the black hole grows) to zero, where the two horizons coincide.
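For concreteness, here are the standard textbook formulas behind those two temperatures (stated as general background, not quoted from the preprint):

```latex
% Schwarzschild-de Sitter spacetime in static coordinates (G = c = 1),
% with cosmological constant \Lambda > 0:
\[
  ds^2 = -f(r)\,dt^2 + \frac{dr^2}{f(r)} + r^2\,d\Omega^2,
  \qquad
  f(r) = 1 - \frac{2M}{r} - \frac{\Lambda r^2}{3}.
\]
% For small enough M, f has two positive roots: the black hole horizon
% r_b and the cosmological horizon r_c > r_b. Each is assigned its own
% temperature via its surface gravity:
\[
  T_i = \frac{\kappa_i}{2\pi},
  \qquad
  \kappa_i = \tfrac{1}{2}\,\bigl|f'(r_i)\bigr|,
  \qquad i \in \{b,\, c\},
\]
% and generically T_b \neq T_c; the two coincide only in the Nariai
% limit, where r_b \to r_c and both surface gravities (in these static
% coordinates) go to zero.
```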

You might wonder, as many have, how to make sense of the two temperatures. This cannot, for a start, be an equilibrium thermodynamics system. Should there be dynamics where the two temperatures try to equalise? Is there heat flow from one horizon to another, perhaps? Maybe there's some missing ingredient needed to make sense of this - do we have any right to be writing down temperatures (an equilibrium thermodynamics concept, really) when the system is not in equilibrium? (Actually, you could ask that about Schwarzschild in flat space - you compute the temperature and then discover that it depends upon the mass in such a way that the system wants to move to a different temperature. But I digress.)

The point of my recent work is that it is entirely within the realm of physics we have to hand to make sense of this. The simple system described in the previous post - the three level maser - has certain key interconnected features that seem relevant:

  • two distinct temperatures,
  • a maximum energy, and
  • a natural instability (population inversion) with a channel for doing work - the maser output.
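The "channel for doing work" can be made concrete with the first law of black hole thermodynamics, stated here in its standard form as background:

```latex
% First law of black hole thermodynamics for a rotating, charged hole:
\[
  dM \;=\; T\,dS \;+\; \Omega\,dJ \;+\; \Phi\,dQ,
\]
% with \Omega the horizon angular velocity and \Phi its electric
% potential. The \Omega\,dJ and \Phi\,dQ pieces are work terms:
% extracting angular momentum J or charge Q draws energy out of the
% hole without heat flow, which is the work channel that rotation or
% charge makes available.
```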

My point is that these features are all present for de Sitter black holes too, starting with the two temperatures. But you won't see the rest by staring at just the Schwarzschild case, you need to add rotation, or charge (or both). As we shall see, the ability to reduce angular momentum, or to reduce charge, will be the work channel. I'll come back to the maximum [...] Click to continue reading this post

The post News from the Front XIX: A-Masing de Sitter appeared first on Asymptotia.

by Clifford at July 29, 2019 06:03 PM

July 26, 2019

Clifford V. Johnson - Asymptotia

News from the Front, XVIII: de Sitter Black Holes and Continuous Heat Engines

[Hubble photo of Jupiter's aurorae.] Another title for this could be "Making sense of de Sitter black hole thermodynamics", I suppose. What I'm going to tell you about is either a direct correspondence or a series of remarkable, inspiring coincidences. Either way, I think you will come away agreeing that there is certainly something interesting afoot.

It is an idea I'd been tossing around in my head from time to time over the years but somehow never put together; then something else I was working on years later, seemingly irrelevant, helped me complete the puzzle, resulting in my new paper, which (you guessed it) I'm excited about.

It all began when I was thinking about heat engines, for black holes in anti-de Sitter, which you may recall me talking about in posts here, here, and here, for example. Those are reciprocating heat engines, taking the system through a cycle that -through various stages- takes in heat, does work, and exhausts some heat, then repeats and repeats. And repeats.

I've told you the story about my realisation that there's this whole literature on quantum heat engines that I'd not known about, that I did not even know of a thing called a quantum heat engine, and my wondering whether my black hole heat engines could have a regime where they could be considered quantum heat engines, maybe enabling them to be useful tools in that arena...(resulting in the paper I described here)... and my delight in combining 18th Century physics with 21st Century physics in this interesting way.

All that began back in 2017. One thing I kept coming back to that really struck me as lovely is what can be regarded as the prototype quantum heat engine. It was recognized as such as far back as 1959!! It is a continuous heat engine, meaning that it does its heat intake and work and heat output all at the same time, as a continuous flow. It is, in fact, a familiar system - the three-level maser! (A basic laser also uses the key elements.)
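The 1959 recognition referred to here is the Scovil–Schulz-DuBois result that the three-level maser is a heat engine obeying a Carnot bound. As a standard statement of it, for orientation:

```latex
% Scovil--Schulz-DuBois (1959): the three-level maser as a heat engine.
% Pump transition at frequency \nu_p coupled to a hot bath at T_h; idler
% transition \nu_i = \nu_p - \nu_s coupled to a cold bath at T_c; the
% signal photons at \nu_s are the work output. The engine's efficiency is
\[
  \eta \;=\; \frac{\nu_s}{\nu_p},
\]
% and requiring population inversion (gain) on the signal transition
% enforces the Carnot bound
\[
  \eta \;\le\; 1 - \frac{T_c}{T_h} \;=\; \eta_{\mathrm{Carnot}}.
\]
```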

A maser can be described as taking in energy as heat from an external source, and giving out energy in the form of heat and work. The work is the desired [...] Click to continue reading this post

The post News from the Front, XVIII: de Sitter Black Holes and Continuous Heat Engines appeared first on Asymptotia.

by Clifford at July 26, 2019 03:44 PM

July 25, 2019

Axel Maas - Looking Inside the Standard Model

Talking about the same thing
In this blog entry I will try to explain my most recent paper. The theme of the paper is rather simply put: you should not compare apples with oranges. The subtlety lies in knowing whether you have an apple or an orange in your hand. This is far less simple than it sounds.

The origin of the problem is once more gauge theories. In gauge theories, we have introduced additional degrees of freedom. And, in fact, we have a choice of how we do this. Of course, our final results will not depend on the choice. However, getting to the final result is not always easy. Thus, ensuring that the intermediate steps are right would be good. But they depend on the choice. And then they are only comparable between two different calculations if, in both calculations, the same choice is made.

Now it seems simple at first to make the same choice. Ultimately, it is our choice, right? But this is actually not that easy in such theories, due to their mathematical complexity. Thus, rather than being made explicitly, the choice is made implicitly. The way this is done is, again for technical reasons, different for different methods. And because of all these technicalities, and the fact that we need to make approximations, figuring out whether the implicit conditions yield the same explicit choice is difficult. This is especially important as the choice modifies the equations describing our auxiliary quantities.
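The post keeps this "choice" abstract; the canonical example of such a choice in gauge theories is gauge fixing. As an illustration only (not necessarily the specific condition studied in the paper):

```latex
% Gauge redundancy: A_\mu and its gauge transform describe the same
% physics. With anti-hermitian, Lie-algebra-valued A_\mu,
\[
  A_\mu \;\longrightarrow\; A_\mu^{g}
  \;=\; g\,A_\mu\,g^{-1} + g\,\partial_\mu g^{-1}.
\]
% A "choice" fixes this redundancy, e.g. the Landau gauge condition
\[
  \partial^\mu A_\mu = 0,
\]
% but non-perturbatively this condition does not single out a unique
% representative on each gauge orbit (Gribov copies). Two methods that
% impose it only implicitly can therefore land on different
% representatives, and their gauge-dependent intermediate quantities
% need not agree.
```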

In the paper I test this. If everything is consistent between two particular methods, then the solutions obtained in one method should solve the equations obtained in the other method. Seems a simple enough idea. There had been various arguments in the past suggesting that this should be the case. But more and more pieces of evidence over the last couple of years led me to think that something was amiss. So I made this test, rather than relying on the arguments.

And indeed, what I find in the article is that the solution of one method does not solve the equation from the other method. The way this happens strongly suggests that the implicit choices made are not equivalent. Hence, the intermediate results are different. This does not mean that they are wrong. They are just not comparable. Either method can still yield internally consistent results. But since neither method is exact, the comparison between both would help reassure us that the approximations made make sense. And this is now hindered.

So, what to do now? We would very much like to have the possibility to compare different methods at the level of the auxiliary quantities. So this needs to be fixed. This can only be achieved if the same choice is made in all the methods. The tough question is in which method we should work on the choice. Should we try to make the same choice as in one fixed method? Should we try to find a new choice in all methods? This is tough, because everything is so implicit, and affected by approximations.

At the moment, I think the best way is to get one of the existing choices to work in all methods. Creating an entirely different one for all methods appears to me far too much additional work. And I, admittedly, have no idea what a better starting point would be than the existing ones. But in which method should we start trying to alter the choice? In neither method does this seem to be simple. In both cases, fundamental obstructions are there, which need to be resolved. I would therefore currently like to start poking around in both methods, hoping that there may be a point in between where the choices of the methods could meet, which would be easier than pushing all the way. I have a few ideas, but they will take time. Probably also a lot more people than just me.

This investigation also amazes me because the theory where this happens is nothing new. Far from it: it is more than half a century old, older than I am. And it is not something obscure, but rather part of the standard model of particle physics. So a very essential element in our description of nature. It never ceases to baffle me how little we still know about it. And how unbelievably complex it is at a technical level.

by Axel Maas (noreply@blogger.com) at July 25, 2019 08:27 AM

July 20, 2019

John Baez - Azimuth

Applied Category Theory 2019 Talks

Applied Category Theory 2019 happened last week! It was very exciting: about 120 people attended, and they’re pushing forward to apply category theory in many different directions. The topics ranged from ultra-abstract to ultra-concrete, sometimes in the same talk.

The talks are listed above — click for a more readable version. Below you can read what Jules Hedges and I wrote about all those talks:

• Jules Hedges, Applied Category Theory 2019.

I tend to give terse summaries of the talks, with links to the original papers or slides. Jules tends to give his impressions of their overall significance. They’re nicely complementary.

You can also see videos of some talks, created by Jelle Herold with help from Fabrizio Genovese:

• Giovanni de Felice, Functorial question answering.

• Antonin Delpeuch, Autonomization of monoidal categories.

• Colin Zwanziger, Natural model semantics for comonadic and adjoint modal type theory.

• Nicholas Behr, Tracelets and tracelet analysis of compositional rewriting systems.

• Dan Marsden, No-go theorems for distributive laws.

• Christian Williams, Enriched Lawvere theories for operational semantics.

• Walter Tholen, Approximate composition.

• Erwan Beurier, Interfacing biology, category theory & mathematical statistics.

• Stelios Tsampas, Categorical contextual reasoning.

• Fabrizio Genovese, idris-ct: A library to do category theory in Idris.

• Michael Johnson, Machine learning and bidirectional transformations.

• Bruno Gavranović, Learning functors using gradient descent.

• Zinovy Diskin, Supervised learning as change propagation with delta lenses.

• Bryce Clarke, Internal lenses as functors and cofunctors.

• Ryan Wisnewsky, Conexus AI.

• Ross Duncan, Cambridge Quantum Computing.

• Erwan Beurier, Memoryless systems generate the class of all discrete systems.

• Blake Pollard, Compositional models for power systems.

• Martti Karvonen, A comonadic view of simulation and quantum resources.

• Quanlong Wang, ZX-Rules for 2-qubit Clifford+T quantum circuits, and beyond.

• James Fairbanks, A compositional framework for scientific model augmentation.

• Titouan Carette, Completeness of graphical languages for mixed state quantum mechanics.

• Antonin Delpeuch, A complete language for faceted dataflow languages.

• John van de Wetering, An effect-theoretic reconstruction of quantum mechanics.

• Vladimir Zamdzhiev, Inductive datatypes for quantum programming.

• Octavio Malherbe, A categorical construction for the computational definition of vector spaces.

• Vladimir Zamdzhiev, Mixed linear and non-linear recursive types.

by John Baez at July 20, 2019 03:23 PM

July 11, 2019

Dmitry Podolsky - NEQNET: Non-equilibrium Phenomena

The Environmental Effects of Smartphone Creation and Disposal

For some people, buying a new smartphone is a treat. They get to keep up with the latest digital trend, experience the wonders of the newest technologies, and showcase their delightful digital purchase. Thanks to the ever-advancing field of information technology, smartphones are becoming a necessity rather than an accessory in today’s society. New technology demands the replacement of the old, after all. In 2018, 1.56 billion smartphones were sold worldwide, a testament to the strength of the desire for the latest device. But smartphones carry a hefty cost that’s not included in their price tags. No, the environment pays this price, and the earth keeps paying from the creation to the disposal of a smartphone.

The Green Price

The insatiable market demand for smartphones and similar devices requires an equally insatiable production process. This process involves enormous amounts of electricity and mining. The average smartphone requires a small amount of precious metals, such as gold and palladium, in its circuit board. To supply manufacturers with these rare metals, companies dig deep into the bones of the earth in environmentally taxing mining efforts. Mining is responsible for up to 95 percent of a smartphone's carbon footprint. Putting all the components of a smartphone together also takes a lot of power: over the last 10 years, the industry has consumed as much electricity as India uses in a year.

More galling still is the fact that manufacturers rarely design smartphones with a lifespan of more than two years. Intense marketing campaigns and advertising continually push people to upgrade their devices and buy new phones. As a result, people dispose of perfectly serviceable, if outdated, devices alongside damaged units. This disposal is the second price the environment pays.

The Price of Disposal


Throwing away a smartphone can harm the environment. Electronic devices and appliances like it are responsible for as much as 70 percent of the toxic waste in dumpsites. Although there are businesses and facilities that attempt to recycle these devices, it's apparently much better for the environment for people to buy pre-owned phones. This is because only a small portion of a smartphone is actually recyclable.

Smelters recover the precious palladium and gold components, but the process releases a host of more harmful substances, including mercury, into the atmosphere. Less scrupulous "recyclers" pay shady companies to take their electronic waste to developing countries for cheaper disposal. In these countries, workers inhale and absorb toxic fumes containing cadmium, nickel, and mercury as they disassemble and scavenge smartphone components.

So what’s the smart way to dispose of smartphones? One way is to buy models with longer life expectancies, but that’s still just a stop-gap measure. To truly make an impact, manufacturers should start designing phones that people can reuse. These devices should also be easily upgradable, negating the need for consumers to keep buying new products just to stay compatible with the digital landscape. Without these kinds of far-ranging changes to the industry, the environmental impact of technology will keep exacting a heavy toll on the Earth.

The post The Environmental Effects of Smartphone Creation and Disposal appeared first on None Equilibrium.

by Bertram Mortensen at July 11, 2019 10:20 AM

Dmitry Podolsky - NEQNET: Non-equilibrium Phenomena

7 Ways to Monetise a Blog

A blog, no matter how big or small, can help you earn money when you do things the right way. There are several ways to monetise your blog. One of the first things you should do is to ask the assistance of an SEO firm from your location in Adelaide. This way, you have better chances of getting more traffic. Here are some ways you can monetise your blog:

Coaching Services

Are you an expert at something? It might be relationships, or computer programming. Chances are, others want to learn about it. If you're good at it, share it. Clients will love finding a coach who is an expert in what interests them. Pick a niche you're good at and coach on that topic. Make sure you know what you're talking about so that clients keep coming back.

Freelance Blogging

A freelance blogger is someone who knows their audience. You should be able to create engaging content, as you'll be working with many different kinds of it. As your blog grows, so does your following. Because of that, there's a strong possibility that many brands will come to you.

Sell Online Courses

Knowing your audience can help determine which courses you can sell them. By knowing their needs and interests, formulate a course that caters specifically to them. You don’t need to have a big following for this one. You can start with only a few followers and work your way up.

Affiliate Marketing

This is a way to sell products or services that aren’t your own through your blog. You can be an affiliate marketer for other products and services. You’ll earn a commission for this. Select products and services that are related to the vision of your blog. Share something that your followers will be interested in. Affiliate marketing is a source of passive income for any blog.

Email Marketing

This works well when you have a list of subscribers. You can use that list to sell you or your affiliate’s products and services. Building a list is hard work, but once you build a strong list, everything else will follow. Build a strong connection with your followers so they’ll trust you enough to subscribe to that list.

Advertisement

You need more than 100,000 visitors a day to get income from this avenue. Having advertisements on your blog makes you a bit of an authority in the blogging world, but as mentioned, you need traffic for this. It all comes down to how good your content is and how you market yourself as a blogger.

Sell eBooks


You have a blog, so that means you have a knack for words. You can use that to your advantage by selling eBooks. Write something you're good at or something your audience will love. Research your topics and plan your writing before you begin. Then, sell your finished products through your blog. Come up with a good marketing strategy; for example, sell three for the price of two. Customers like promos, so make use of that knowledge.

Your blog can earn money for you if you know how to use it to your advantage. Before that, make sure you establish a following, even in small numbers. Know your craft and advocate it. Keep working hard and stay dedicated. The money will come when your audience is having a good time reading your blog.

The post 7 Ways to Monetise a Blog appeared first on None Equilibrium.

by Bertram Mortensen at July 11, 2019 01:00 AM

Dmitry Podolsky - NEQNET: Non-equilibrium Phenomena

Things that Small Businesses Overlook Online

Customers online only want one thing: easy access to information. They want to search for a service and find a business that offers what they need. Naturally, they’ll check out the top results first. The difference between these results is how they attract potential clients to make them click through to their website.

This is a problem with most small businesses. They might have a great physical store, but the lack or mismanagement of information online makes them hard to find. What you need is a strong online presence so that people can find you first. Let’s look at some of the best ways you can bring your business to the top of every customer’s mind:

1. Improve Your Online Presence

It's one thing to have your business website appear on Google. It's another for it to be featured in a business directory or on a review website. A lot of social media platforms pull business contact information from these listings, while reviews of your business can help boost its SEO ranking.

Try Google Business or the Yellow Pages for starters. Next, search for business directories in your area and make sure that yours is listed. Consistency is essential, so don’t forget to use the same NAP information (name, address, and phone number) across all listings.

Put yourself in the customer's shoes: you'll trust a business that has the same contact information across all sites. Clients may quickly conclude that shops with inconsistent information have already closed.

2. Engage the Locals


Proximity and location play a large part in foot and web traffic. People visit the store they searched for 50% of the time, especially if they’re nearby. Take advantage of this by having your store’s location listed in as many places as you can.

If there’s a lot of competition in your area, you’ll have to improve not only your global search results but your local SEO, too. There are several ways to do this. Consulting a digital marketing agency in Virginia Beach can be a good start. Experts can review your business and find ways to promote it.

They can help boost your online rankings and build brand recognition in your community. This is important as building a strong local presence can have positive effects for years to come.

3. Energize Your Content

SEO experts know that regular posting boosts your search ranking. But what’s more important is how engaging that content is. Quality content gets shared more on social media.

Besides blog posts, consider posting photos and videos. Don’t forget to tag the location where you took them. Give customers a tour of your business by sharing what it looks like inside and out.

Your product or service might be helpful to another website or business. Once you have a lot of good content, try approaching sites to link back to you. These links are called backlinks. The more backlinks you have, the more search engines will recognize your domain's legitimacy. This is called domain authority. The higher your domain's (that is, your website's) authority, the better your search rankings will be.

Good SEO is not an accident. There is no shortcut. Take note of these three essential pointers to help build your SEO rankings. After all that hard work, make sure that people who do visit your website get what they need. Make customers happy and give them an option to leave feedback. Remember that positive reviews are one of the best ways to increase web traffic.

The post Things that Small Businesses Overlook Online appeared first on None Equilibrium.

by Bertram Mortensen at July 11, 2019 01:00 AM

July 08, 2019

Sean Carroll - Preposterous Universe

Spacetime and Geometry: Now at Cambridge University Press

Hard to believe it’s been 15 years since the publication of Spacetime and Geometry: An Introduction to General Relativity, my graduate-level textbook on everyone’s favorite theory of gravitation. The book has become quite popular, being used as a text in courses around the world. There are a lot of great GR books out there, but I felt another one was needed that focused solely on the goal of “teach students general relativity.” That might seem like an obvious goal, but many books also try to serve as reference books, or to put forward a particular idiosyncratic take on the subject. All I want to do is to teach you GR.

And now I’m pleased to announce that the book is changing publishers, from Pearson to Cambridge University Press. Even with a new cover, shown above.

I must rush to note that it’s exactly the same book, just with a different publisher. Pearson was always good to me, I have no complaints there, but they are moving away from graduate physics texts, so it made sense to try to find S&G a safe permanent home.

Well, there is one change: it’s cheaper! You can order the book either from CUP directly, or from other outlets such as Amazon. Copies had been going for roughly $100, but the new version lists for only $65 — and if the Amazon page is to be believed, it’s currently on sale for an amazing $46. That’s a lot of knowledge for a minuscule price. I’d rush to snap up copies for you and your friends, if I were you.

My understanding is that copies of the new version are not quite in stores yet, but they’re being printed and should be there momentarily. Plenty of time for courses being taught this Fall. (Apologies to anyone who has been looking for the book over the past couple of months, when it’s been stuck between publishers while we did the handover.)

Again: it’s precisely the same book. I have thought about doing revisions to produce an actually new edition, but I think about many things, and that’s not a super-high priority right now. Maybe some day.

Thanks to everyone who has purchased Spacetime and Geometry over the years, and said such nice things about it. Here’s to the next generation!

by Sean Carroll at July 08, 2019 08:03 PM

July 03, 2019

John Baez - Azimuth

Applied Category Theory 2019 Program

Bob Coecke, David Spivak, Christina Vasilakopoulou and I are running a conference on applied category theory:

Applied Category Theory 2019, 15–19 July, 2019, Lecture Theatre B of the Department of Computer Science, 10 Keble Road, Oxford.

You can now see the program here, or below. Hope to see you soon!

by John Baez at July 03, 2019 10:25 PM

July 01, 2019

John Baez - Azimuth

Structured Cospans

My grad student Kenny Courser gave a talk at the 4th Symposium on Compositional Structures. He spoke about his work with Christina Vasilakopoulou and me. We’ve come up with a theory that can handle a broad class of open systems, from electrical circuits to chemical reaction networks to Markov processes and Petri nets. The idea is to treat open systems as morphisms in a category of a particular kind: a ‘structured cospan category’.

Here is his talk:

• Kenny Courser, Structured cospans.

On July 11th I’m going to talk about structured cospans at the big annual category theory conference, CT2019:

• John Baez, Structured cospans.

I borrowed more than just the title from Kenny’s talk… but since I’m an old guy, they’re giving me time to say more stuff. For full details, try Kenny’s thesis:

• Kenny Courser, The Mathematics of Open Systems from a Double Categorical Perspective.

This thesis is not quite in its final form, so I won’t try to explain it all now. But it’s full of great stuff, so I hope you look at it! If you have any questions or corrections please let us know.

We’ve been working on this project for a couple of years, so there’s a lot to say… but right now let me just tell you what a ‘structured cospan’ is.

Suppose you have any functor L \colon \mathsf{A} \to \mathsf{X}. Then a structured cospan is a diagram of the shape L(a) \to x \leftarrow L(b), where a and b are objects of \mathsf{A} and x is an object of \mathsf{X}:

For example if L \colon \mathsf{A} \to \mathsf{X} is the functor from sets to graphs sending each set to the graph with that set of vertices and no edges, a structured cospan looks like this:

It’s a graph with two sets getting mapped into its set of vertices. I call this an open graph. Or if L \colon \mathsf{A} \to \mathsf{X} is the functor from sets to Petri nets sending each set to the Petri net having that set of places and nothing else, a structured cospan looks like this:

You can read a lot more about this example here:

• John Baez, Open Petri nets, Azimuth, 15 August 2018.

It illustrates many ideas from the general theory of structured cospans: for example, what we do with them.
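To make the open-graph example concrete, here is a small Python sketch of open graphs as cospans of sets, composed by gluing outputs to inputs. The class and function names are my own illustration, not code from any of the papers above:

```python
# Illustrative sketch (my own code, not from the papers): open graphs as
# structured cospans of sets, with composition by gluing along a shared
# boundary -- the pushout that makes open systems composable.

class OpenGraph:
    """A graph together with maps from two boundary sets into its vertices."""
    def __init__(self, vertices, edges, left, right):
        self.vertices = set(vertices)  # vertex set of the underlying graph
        self.edges = set(edges)        # edges as (source, target) pairs
        self.left = dict(left)         # input boundary:  A -> vertices
        self.right = dict(right)       # output boundary: B -> vertices

def compose(g, h):
    """Glue g's output boundary to h's input boundary (a pushout of sets).

    Assumes g.right and h.left are indexed by the same boundary set."""
    parent = {('g', v): ('g', v) for v in g.vertices}
    parent.update({('h', v): ('h', v) for v in h.vertices})

    def find(v):                       # union-find over the disjoint union
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    for b in g.right:                  # identify the shared boundary images
        parent[find(('g', g.right[b]))] = find(('h', h.left[b]))

    verts = {find(v) for v in parent}
    edges = {(find(('g', s)), find(('g', t))) for s, t in g.edges} | \
            {(find(('h', s)), find(('h', t))) for s, t in h.edges}
    left = {a: find(('g', v)) for a, v in g.left.items()}
    right = {c: find(('h', v)) for c, v in h.right.items()}
    return OpenGraph(verts, edges, left, right)

# Demo: composing the open edges x -> y and u -> v along a one-point
# boundary identifies y with u, giving a two-edge path on three vertices.
g = OpenGraph({'x', 'y'}, {('x', 'y')}, {1: 'x'}, {1: 'y'})
h = OpenGraph({'u', 'v'}, {('u', 'v')}, {1: 'u'}, {1: 'v'})
gh = compose(g, h)
print(len(gh.vertices), len(gh.edges))  # 3 2
```

The tagging with 'g'/'h' forms the disjoint union and the union-find identification is the quotient, so `compose` really does compute the pushout of the two vertex sets over the shared boundary.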

You may have heard of a similar idea: ‘decorated cospans’, invented by Brendan Fong. You may wonder what’s the difference!

Kenny’s talk explains the difference pretty well. Basically, decorated cospans that look isomorphic may not be technically isomorphic. For example, if we have an open graph like this:

and its set of edges is \{a,b,c,d\}, this is not isomorphic to the identical-looking open graph whose set of edges is \{b,c,d,e\}. That’s right: the names of the edges matter!

This is an annoying glitch in the formalism. As Kenny’s talk explains, structured cospans don’t suffer from this problem.
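The name-sensitivity can be seen in miniature: treat a decoration naively as named data, and equality fails even though the underlying graphs are isomorphic. (A tiny illustrative check in Python, with hypothetical edge data; this is not an implementation of decorated cospans.)

```python
from collections import Counter

# The same square-shaped graph decorated two ways: edge names {a,b,c,d}
# versus {b,c,d,e}, with identical sources and targets (hypothetical data,
# just to illustrate the point).
dec1 = {'a': (1, 2), 'b': (2, 3), 'c': (3, 4), 'd': (4, 1)}
dec2 = {'b': (1, 2), 'c': (2, 3), 'd': (3, 4), 'e': (4, 1)}

# As named data the decorations are simply unequal...
print(dec1 == dec2)  # False

# ...but forgetting the names leaves the same multiset of edges, so the
# underlying graphs are isomorphic (here even identical):
print(Counter(dec1.values()) == Counter(dec2.values()))  # True
```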

My talk at CT2019 explains another way to fix this problem: using a new improved concept of decorated cospan! This new improved concept gives results that match those coming from structured cospans in many cases. Proving this uses some nice theorems proved by Kenny Courser, Christina Vasilakopoulou and also Daniel Cicala.

But I think structured cospans are simpler than decorated cospans. They get the job done more easily in most cases, though they don’t handle everything that decorated cospans do.

I’ll be saying more about structured cospans as time goes on. The basic theorem, in case you’re curious but don’t want to look at my talk, is this:

Theorem. Let \mathsf{A} be a category with finite coproducts, \mathsf{X} a category with finite colimits, and L \colon \mathsf{A} \to \mathsf{X} a functor preserving finite coproducts. Then there is a symmetric monoidal category {}_L \mathsf{Csp}(\mathsf{X}) where:

• an object is an object of \mathsf{A}
• a morphism is an isomorphism class of structured cospans:

Here two structured cospans are isomorphic if there is a commutative diagram of this form:

If you don’t want to work with isomorphism classes of structured cospans, you can use a symmetric monoidal bicategory where the 1-morphisms are actual structured cospans. But following ideas of Mike Shulman, it’s easier to work with a symmetric monoidal double category. So:

Theorem. Let \mathsf{A} be a category with finite coproducts, \mathsf{X} a category with finite colimits, and L \colon \mathsf{A} \to \mathsf{X} a functor preserving finite coproducts. Then there is a symmetric monoidal double category {_L \mathbb{C}\mathbf{sp}(\mathsf{X})} where:

• an object is an object of \mathsf{A}
• a vertical 1-morphism is a morphism of \mathsf{A}
• a horizontal 1-cell is a structured cospan

• a 2-morphism is a commutative diagram

by John Baez at July 01, 2019 12:26 AM

June 19, 2019

Axel Maas - Looking Inside the Standard Model

Creativity in physics
One of the most widespread misconceptions about physics, and the other natural sciences, is that they are quite the opposite of art: precise, fact-driven, logical, and systematic, while art is perceived as emotional, open, creative, and inspired.

Of course, physics has experiments, has data, has math. All of that has to be fitted perfectly together, and there is no room for slip-ups. Logical deduction is central to what we do. But this is not all. In fact, these parts are more like the handiwork. Just as a painter needs to be able to draw a line, and a writer needs to be able to write coherent sentences, so we need to be able to calculate, build, check, and infer. But just as the act of drawing a line or writing a sentence is not yet what we recognize as art, solving an equation is not yet physics.

We are able to solve an equation because we learned how during our studies. We learned what was known before. That is our tool set, just as people read books before starting to write one. But when we actually do research, we face the fact that nobody knows what is going on. Quite often we do not even know what an adequate question to pose would be. We just stand there, baffled, before a couple of observations. That is where the same act of creativity has to set in as when writing a book or painting a picture. We need an idea, an inspiration, on how to start. And then, just as a writer writes page after page, we add various pieces to this idea until we have a hypothesis of what is going on. This is like having the first draft of a book. Then the real grinding starts, where all our education comes to bear: we have to calculate, and so on, just as the writer has to fix the draft to turn it into a book.

You may now wonder whether this creativity is limited to the great minds, and to the inception of whole new steps in physics. No, far from it. On the one hand, physics is not the work of lone geniuses. Sure, occasionally somebody has the right idea. But that is usually just the one idea which turned out, in the end, to be correct; all the other good ideas other people had turned out to be incorrect, and you never hear of them for that reason. On the other hand, every new idea, as said above, eventually requires everything that was done before. And more than that: creativity is rarely born out of being a hermit. It is often sparked by others. Talking to each other, throwing fragments of ideas at each other, and mulling over consequences together is what creates the soil where creativity sprouts. Everyone you have interacted with has contributed to the birth of your idea.

This is why the genuinely big breakthroughs have often resulted from so-called blue-sky or curiosity-driven research. It is not a coincidence that the freedom to do whatever kind of research you think is important is an almost sacred privilege of hired scientists. Or should be. Fortunately, especially in the European Union, I am privileged enough to have it. In other places, you are often shackled by all kinds of external influences, down to political pressure to only do politically acceptable research. And that can never spark the creativity you need to make something genuinely new. If you are afraid of what you say, you start to restrain yourself, and ultimately anything not already established as acceptable becomes unthinkable. This may not always be as obvious as outright political pressure. But if being hired, or keeping your job, starts to depend on it, you start going for acceptable research, because failure with something new would cost you dearly. And with the competitive funding currently prevalent, particularly for non-permanently hired people, this is becoming a serious obstruction.

As a consequence, real breakthrough research can neither be planned nor done on purpose. You can only plan the grinding part. And failure will be part of any creative process. Though you never really fail: you always learn how something does not work. That is one of the reasons I strongly want failures to become publicly available as well. They are as important to progress as successes, by narrowing down the possibilities. Not to mention the amount of researchers' lifetime wasted because they fail with the same attempt, not knowing that others failed before them.

And then, perhaps, a new scientific insight arises. And, more often than not, some great technology arises along the way: not intentionally, but because it was necessary to follow one's creativity. That is actually where most technological leaps came from. So, real progress in physics is, in the end, made from about a third craftsmanship, a third communication, and a third creativity.

So, after all this general stuff, how do I stay creative?

Well, first of all, I was and am sufficiently privileged. I could afford to start out just following my ideas: either that would keep me in business, or I would have to find a non-science job. But this only worked because of my personal background: I could have afforded a couple of months without income while finding a job, and I had an education which almost guarantees me a decent job eventually. And an education of that quality I could only afford because of my personal background. Not to mention that as a white male I faced no systemic barriers. So, yes, privilege plays a major role.

The other part was learning, more and more, that it is not effort that counts, but effect. That took me years. But eventually I understood that a creative idea cannot be forced by burying myself in work. Time off is just as important to me; it took me until close to the end of my PhD to realize that. Not working overtime, and enjoying free days and holidays, is as important for my creative process as any other condition. Not to mention that I also do all the non-creative chores much more efficiently when well rested, which eventually leaves me with more time to ponder creatively and do research.

And the last ingredient is really exchange. I have now had the opportunity, during a sabbatical, to go to different places and exchange ideas with a lot of people. This gave me what I needed to pick up a new field and already have new ideas for it. It is the possibility to sit down with people for a few hours, especially in a nicer and more relaxing setting than an office, and just discuss ideas. That is also what I like most about conferences, and one of the reasons I think conferences will always be necessary, even though we need to make the travel there and back ecologically much more viable, and restrict ourselves to sufficiently close ones until that is possible.

Sitting down over a good cup of coffee or a nice meal and just discussing really jump-starts my creativity. Even sitting with a good cup of coffee in a nice café somewhere and just thinking does wonders for my problem solving. In that respect, it seems, I am not so different from artists after all.

by Axel Maas (noreply@blogger.com) at June 19, 2019 02:53 PM

June 18, 2019

Marco Frasca - The Gauge Connection

Cracks in the Witten’s index theorem?

Recently, a rather interesting paper (see here for the preprint) appeared in Physical Review Letters. Its authors study the {\cal N}=1 Wess-Zumino model, the prototype of all further SUSY models, and show that there exists an anomaly at one loop in perturbation theory that breaks supersymmetry. This is rather shocking, as the model is supersymmetric at the classical level and, by Witten’s index theorem, no breaking of supersymmetry should ever be observed. Indeed, the authors, in their conclusions, correctly ask how Witten’s theorem copes with this rather strange behavior. Of course, Witten’s theorem is correct, so the question arises naturally and is very interesting for further study.

This result is important because I ran into a similar situation with the Wess-Zumino model in a couple of papers. The first one (see here and here) was published and shows how the classical Wess-Zumino model, in a strong coupling regime, breaks supersymmetry. There I asked a similar question to the one above: How do quantum corrections recover Witten’s theorem? The second one has remained a preprint (see here). I tried to send it to Physics Letters B, but the referee, without any check of the mathematics, simply claimed that Witten’s theorem forbids my conclusions, and the Editor asked me to withdraw the paper for that very reason. So I never submitted the paper again and just checked the classical case, where I was luckier.

So, my question is still alive: Has supersymmetry in itself the seeds of its breaking?

This is really important in view of the fact that the Minimal Supersymmetric Standard Model (MSSM), now in disgrace after the LHC results, can have a dark side in its soft supersymmetry-breaking sector. This, in turn, could entail a wrong understanding of where the superpartners should be after the breaking. Anyway, it is really something exciting already at the theoretical level. We are just stressing Witten’s index theorem in search of answers.

by mfrasca at June 18, 2019 03:06 PM

June 14, 2019

Matt Strassler - Of Particular Significance

A Ring of Controversy Around a Black Hole Photo

[Note Added: Thanks to some great comments I’ve received, I’m continuing to add clarifying remarks to this post.  You’ll find them in green.]

It’s been a couple of months since the `photo’ (a false-color image created to show the intensity of radio waves, not visible light) of the black hole at the center of the galaxy M87, taken by the Event Horizon Telescope (EHT) collaboration, was made public. Before it was shown, I wrote an introductory post explaining what the ‘photo’ is and isn’t. There I cautioned readers that I thought it might be difficult to interpret the image, and controversies about it might erupt.

So far, the claim that the image shows the vicinity of M87’s black hole (which I’ll call `M87bh’ for short) has not been challenged, and I’m not expecting it to be. But what and where exactly is the material that is emitting the radio waves and thus creating the glow in the image? And what exactly determines the size of the dark region at the center of the image? These have been problematic issues from the beginning, but discussion is starting to heat up. And it’s important: it has implications for the measurement of the black hole’s mass (which EHT claims is that of 6.5 billion Suns, with an uncertainty of about 15%), and for any attempt to estimate its rotation rate.

Over the last few weeks I’ve spent some time studying the mathematics of spinning black holes, talking to my Harvard colleagues who are world experts on the relevant math and physics, and learning from colleagues who produced the `photo’ and interpreted it. So I think I can now clearly explain what most journalists and scientist-writers (including me) got wrong at the time of the photo’s publication, and clarify what the photo does and doesn’t tell us.

One note before I begin: this post is long. But it starts with a summary of the situation that you can read quickly, and then comes the long part: a step-by-step non-technical explanation of an important aspect of the black hole ‘photo’ that, to my knowledge, has not yet been given anywhere else.

[I am heavily indebted to Harvard postdocs Alex Lupsasca and Shahar Hadar for assisting me as I studied the formulas and concepts relevant for fast-spinning black holes. Much of what I learned comes from early 1970s papers, especially those by my former colleague Professor Jim Bardeen (see this one written with Press and Teukolsky), and from papers written in the last couple of years, especially this one by my present and former Harvard colleagues.]

What Does the EHT Image Show?

Scientists understand the black hole itself — the geometric dimple in space and time — pretty well. If one knows the mass and the rotation rate of the black hole, and assumes Einstein’s equations for gravity are mostly correct (for which we have considerable evidence, for example from LIGO measurements and elsewhere), then the equations tell us what the black hole does to space and time and how its gravity works.

But for the `photo’, that’s not enough information. We don’t get to observe the black hole itself (it’s black, after all!) What the `photo’ shows is a blurry ring of radio waves, emitted from hot material (a plasma of mostly electrons and protons) somewhere around the black hole — material whose location, velocity, and temperature we do not know. That material and its emission of radio waves are influenced by powerful gravitational forces (whose details depend on the rotation rate of the M87bh, which we don’t know yet) and powerful magnetic fields (whose details we hardly know at all.) The black hole’s gravity then causes the paths on which the radio waves travel to bend, even more than a glass lens will bend the path of visible light, so that where things appear in the ‘photo’ is not where they are actually located.

The only insights we have into this extreme environment come from computer simulations and a few other `photos’ at lower magnification. The simulations are based on well-understood equations, but the equations have to be solved approximately, using methods that may or may not be justified. And the simulations don’t tell you where the matter is; they tell you where the material will go, but only after you make a guess as to where it is located at some initial point in time. (In the same sense: computers can predict the national weather tomorrow only when you tell them what the national weather was yesterday.) No one knows for sure how accurate or misleading these simulations might be; they’ve been tested against some indirect measurements, but no one can say for sure what flaws they might have.

However, there is one thing we can certainly say, and it has just been said publicly in a paper by Samuel Gralla, Daniel Holz and Robert Wald.

Two months ago, when the EHT `photo’ appeared, it was widely reported in the popular press and on blogs that the photo shows the image of a photon sphere at the edge of the shadow of the M87bh. (Instead of `shadow’, I suggested the term ‘quasi-silhouette‘, which I viewed as somewhat less misleading to a non-expert.)

Unfortunately, it seems these statements are not true; and this was well-known to (but poorly communicated by, in my opinion) the EHT folks.  This lack of clarity might perhaps annoy some scientists and science-loving non-experts; but does this issue also matter scientifically? Gralla et al., in their new preprint, suggest that it does (though they were careful to not yet make a precise claim.)

The Photon Sphere Doesn’t Exist

Indeed, if you happened to be reading my posts carefully when the `photo’ first appeared, you probably noticed that I was quite vague about the photon-sphere — I never defined precisely what it was. You would have been right to read this as a warning sign, for indeed I wasn’t getting clear explanations of it from anyone. Studying the equations and conversing with expert colleagues, I soon learned why: for a rotating black hole, the photon sphere doesn’t really exist.

But let’s first define what the photon sphere is for a non-rotating black hole! Like the Earth’s equator, the photon sphere is a location, not an object. This location is the surface of an imaginary ball, lying well outside the black hole’s horizon. On the photon sphere, photons (the particles that make up light, radio waves, and all other electromagnetic waves) travel on special circular or spherical orbits around the black hole.

By contrast, a rotating black hole has a larger, broader `photon-zone’ where photons can have special orbits. But you won’t ever see the whole photon zone in any image of a rotating black hole. Instead, a piece of the photon zone will appear as a `photon ring‘, a bright and very thin loop of radio waves. However, the photon ring is not the edge of anything spherical, is generally not perfectly circular, and generally is not even perfectly centered on the black hole.

… and the Photon Ring Isn’t What We See…

It seems likely that the M87bh is rotating quite rapidly, so it has a photon-zone rather than a photon-sphere, and images of it will have a photon ring. Ok, fine; but then, can we interpret EHT’s `photo’ simply as showing the photon ring, blurred by the imperfections in the `telescope’? Although some of the EHT folks have seemed to suggest the answer is “yes”, Gralla et al. suggest the answer is likely “no” (and many of their colleagues have been pointing out the same thing in private.) The circlet of radio waves that appears in the EHT `photo’ is probably not simply a blurred image of M87bh’s photon ring; it probably shows a combination of the photon ring with something brighter (as explained below). That’s where the controversy starts.

…so the Dark Patch May Not Be the Full Shadow…

The term `shadow’ is confusing (which is why I prefer `quasi-silhouette’ in describing it in public contexts, though that’s my own personal term) but no matter what you call it, in its ideal form it is supposed to be an absolutely dark area whose edge is the photon ring. But in reality the perfectly dark area need not appear so dark after all; it may be partly filled in by various effects. Furthermore, since the `photo’ may not show us the photon ring, it’s far from clear that the dark patch in the center is the full shadow anyway. The EHT folks are well aware of this, but at the time the photo came out, many science writers and scientist-writers (including me) were not.

…so EHT’s Measurement of the M87bh’s Mass is Being Questioned

It was wonderful that EHT could make a picture that could travel round the internet at the speed of light, and generate justifiable excitement and awe that human beings could indirectly observe such an amazing thing as a black hole with a mass of several billion Sun-like stars. Qualitatively, they achieved something fantastic in showing that yes, the object at the center of M87 really is as compact and dark as such a black hole would be expected to be! But the EHT telescope’s main quantitative achievement was a measurement of the mass of the M87bh, with a claimed precision of about 15%.

Naively, one could imagine that the mass is measured by looking at the diameter of the dark spot in the black hole ‘photo’, under the assumption that it is the black hole’s shadow. So here’s the issue: Could interpreting the dark region incorrectly perhaps lead to a significant mistake in the mass measurement, and/or an underestimate of how uncertain the mass measurement actually is?

I don’t know.  The EHT folks are certainly aware of these issues; their simulations show them explicitly.  The mass of the M87bh isn’t literally measured by putting a ruler on the ‘photo’ and measuring the size of the dark spot! The actual methods are much more sophisticated than that, and I don’t understand them well enough yet to explain, evaluate or criticize them. All I can say with confidence right now is that these are important questions that experts currently are debating, and consensus on the answer may not be achieved for quite a while.

———————————————————————-

The Appearance of a Black Hole With Nearby Matter

Ok, now I’m going to explain the most relevant points, step-by-step. Grab a cup of coffee or tea, find a comfy chair, and bear with me.

Because fast-rotating black holes are more complicated, I’m going to start illuminating the controversy by looking at a non-rotating black hole’s properties, which is also what Gralla et al. mainly do in their paper. It turns out the qualitative conclusion drawn from the non-rotating case largely applies in the rotating case too, at least in the case of the M87bh as seen from our perspective; that’s important because the M87bh may well be rotating at a very good clip.

A little terminology first: for a rotating black hole there’s a natural definition of the poles and the equator, just as there is for the Earth: there’s an axis of rotation, and the poles are where that axis intersects with the black hole horizon. The equator is the circle that lies halfway between the poles. For a non-rotating black hole, there’s no such axis and no such automatic definition, but it will be useful to define the north pole of the black hole to be the point on the horizon closest to us.

A Single Source of Electromagnetic Waves

Let’s imagine placing a bright light bulb on the same plane as the equator, outside the black hole horizon but rather close to it. (The bulb could emit radio waves or visible light or any other form of electromagnetic waves, at any frequency; for what I’m about to say, it doesn’t matter at all, so I’ll just call it `light’.) See Figure 1. Where will the light from the bulb go?

Some of it, heading inward, ends up in the black hole, while some of it heads outward toward distant observers. The gravity of the black hole will bend the path of the light. And here’s something remarkable: a small fraction of the light, aimed just so, can actually spiral around the black hole any number of times before heading out. As a result, you will see the bulb not once but multiple times!

There will be a direct image — light that comes directly to us — from near the bulb’s true location (displaced because gravity bends the light a bit, just as a glass lens will distort the appearance of what’s behind it.) The path of that light is the orange arrow in Figure 1. But then there will be an indirect image (the green arrow in Figure 1) from light that goes halfway around the black hole before heading in our direction; we will see that image of the bulb on the opposite side of the black hole. Let’s call that the `first indirect image.’ Then there will be a second indirect image from light that orbits the black hole once and comes out near the direct image, but further out; that’s the blue arrow in Figure 1. Then there will be a third indirect image from light that goes around one and a half times (not shown), and so on. In short, Figure 1 shows the paths of the direct, first indirect, and second indirect images of the bulb as they head toward our location at the top of the image.


Figure 1: A light bulb (yellow) outside but near the non-rotating black hole’s horizon (in black) can be seen by someone at the top of the image not only through the light that goes directly upward (orange line) — a “direct image” — but also through light that makes partial or complete orbits of the black hole — “indirect images.” The first indirect and second indirect images are from light taking the green and blue paths. For light to make orbits of the black hole, it must travel near the grey-dashed circle that indicates the location of a “photon-sphere.” (A rotating black hole has no such sphere, but when seen from the north or south pole, the light observed takes similar paths to what is shown in this figure.) [The paths of the light rays were calculated carefully using Mathematica 11.3.]

What you can see in Figure 1 is that both the first and second indirect images are formed by light that spends part of its time close to a special radius around the black hole, shown as a dotted line. This imaginary surface, the edge of a ball, is an honest “photon-sphere” in the case of a non-rotating black hole.
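For a non-rotating (Schwarzschild) black hole the location of this photon sphere can be checked directly: circular photon orbits sit at the maximum of the effective potential V(r) = (1 − 2M/r)/r², which lies at r = 3M in units where G = c = 1, i.e. 1.5 times the horizon radius. Here is a quick numerical check (my own sketch, not the Mathematica computation used for the figure):

```python
import numpy as np

M = 1.0  # black hole mass in geometric units (G = c = 1); horizon at r = 2M

# Effective potential for null geodesics in Schwarzschild spacetime:
# a photon with impact parameter b has a circular orbit where 1/b^2
# equals the maximum of V.
def V(r):
    return (1.0 - 2.0 * M / r) / r**2

r = np.linspace(2.01, 10.0, 200_000)
r_ph = r[np.argmax(V(r))]            # radius of the photon sphere

# Critical impact parameter for light to circle the hole:
# b_c = 1/sqrt(V(r_ph)) = 3*sqrt(3) M for a non-rotating hole.
b_c = 1.0 / np.sqrt(V(r_ph))

print(f"photon sphere radius: {r_ph:.3f} M")      # ≈ 3.000 M
print(f"critical impact parameter: {b_c:.3f} M")  # ≈ 5.196 M
```

Light aimed with impact parameter just above b_c is what winds repeatedly near the dotted circle in Figure 1 before escaping.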

In the case of a rotating black hole, something very similar happens when you’re looking at the black hole from its north (or south) pole; there’s a special circle then too. But that circle is not the edge of a photon-sphere! In general, photons can have special orbits in a wide region, which I called the “photon-zone” earlier, and only a small set of them are on this circle. You’ll see photons from other parts of the photon zone if you look at the black hole not from the poles but from some other angle.

[If you’d like to learn a bit more about the photon zone, and you have a little bit of knowledge of black holes already, you can profit from exploring this demo by Professor Leo Stein: https://duetosymmetry.com/tool/kerr-circular-photon-orbits/ ]

Back to the non-rotating case: What our camera will see, looking at what is emitted from the light bulb, is shown in Figure 2: an infinite number of increasingly squished `indirect’ images, half on one side of the black hole near the direct image, and the other half on the other side. What is not obvious, but true, is that only the first of the indirect images is large and bright; this is one of Gralla et al.‘s main points. We can, therefore, separate the images into the direct image, the first indirect image, and the remaining indirect images. The total amount of light coming from the direct image and the first indirect image can be large, but the total amount of light from the remaining indirect images is typically (according to Gralla et al.) less than 5% of the light from the first indirect image. And so, unless we have an extremely high-powered camera, we’ll never pick those other images up. Let’s therefore focus our attention on the direct image and the first indirect image.


Figure 2: What the drawing in Figure 1 actually looks like to the observer peering toward the black hole; all the indirect images lie at almost exactly the same distance from the black hole’s center.
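The claim that the higher indirect images are negligible can be made quantitative with a back-of-the-envelope estimate. Linearizing the Schwarzschild photon-orbit equation u'' = 3Mu² − u (with u = 1/r) about the circular orbit u = 1/3M gives δu'' = δu, so deviations from the orbit grow like e^φ, and each extra half-orbit suppresses an image’s flux by roughly e^(−π). A one-line check (my own sketch of this standard estimate):

```python
import math

# Each successive indirect image requires the light to complete one extra
# half-orbit (phi -> phi + pi) near the unstable circular photon orbit.
# Deviations from that orbit grow like e^phi, so the flux of each
# successive image is suppressed by roughly:
suppression = math.exp(-math.pi)
print(f"per-half-orbit suppression: {suppression:.3f}")  # about 0.043

# i.e. each extra image carries roughly 4% of the previous one's light,
# consistent with Gralla et al.'s "less than 5%".
```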

WARNING (since this seems to be a common confusion):

IN ALL MY FIGURES IN THIS POST, AS IN THE BLACK HOLE `PHOTO’ ITSELF, THE COLORS OF THE IMAGES ARE CHOSEN ARBITRARILY (as explained in my first blog post on this subject.) THE `PHOTO’ WAS TAKEN AT A SINGLE, NON-VISIBLE FREQUENCY OF ELECTROMAGNETIC WAVES: EVEN IF WE COULD SEE THAT TYPE OF RADIO WAVE WITH OUR EYES, IT WOULD BE A SINGLE COLOR, AND THE ONLY THING THAT WOULD VARY ACROSS THE IMAGE IS BRIGHTNESS. IN THIS SENSE, A BLACK AND WHITE IMAGE MIGHT BE CLEARER CONCEPTUALLY, BUT IT IS HARDER FOR THE EYE TO PROCESS.

A Circular Source of Electromagnetic Waves

Proceeding step by step toward a more realistic situation, let’s replace our ordinary bulb by a circular bulb (Figure 3), again set somewhat close to the horizon, sitting in the plane that contains the equator. What would we see now?

BHTruthCirc2.png

Figure 3: if we replace the light bulb with a circle of light, the paths of the light are the same as in Figure 1, except now for each point along the circle. That means each direct and indirect image itself forms a circle, as shown in the next figure.

That’s shown in Figure 4: the direct image is a circle (looking somewhat larger than it really is); outside it sits the first indirect image of the ring; and then come all the other indirect images, looking quite dim and all piling up at one radius. We’re going to call all those piled-up images the “photon ring”.

BHObsvCirc3.png

Figure 4: The circular bulb’s direct image is the bright circle, but a somewhat dimmer first indirect image appears further out, and just beyond one finds all the other indirect images, forming a thin `photon ring’.

Importantly, if we consider circular bulbs of different diameter [yellow, red and blue in Figure 5], then although the direct images reflect the differences in the bulbs’ diameters (somewhat enlarged by lensing), the first indirect images all are about the same diameter, just a tad larger or smaller than the photon ring.  The remaining indirect images all sit together at the radius of the photon ring.

BH3Circ4.png

Figure 5: Three bulbs of different diameter (yellow, blue, red) create three distinct direct images, but their first indirect images are located much closer together, and very close to the photon ring where all their remaining indirect images pile up.

These statements are also essentially true for a rotating black hole seen from the north or south pole; a circular bulb generates a series of circular images, and the indirect images all pile more or less on top of each other, forming a photon ring. When viewed off the poles, the rotating black hole becomes a more complicated story, but as long as the viewing angle is small enough, the changes are relatively minor and the picture is qualitatively somewhat similar.

A Disk as a Source of Electromagnetic Waves

And what if you replaced the circular bulb with a disk-shaped bulb, a sort of glowing pancake with a circular hole at its center, as in Figure 6? That’s relevant because black holes are thought to have `accretion disks’ made of material orbiting the black hole, and eventually spiraling in. The accretion disk may well be the dominant source emitting radio waves at the M87bh. (I’m showing a very thin uniform disk for illustration, but a real accretion disk is not uniform, changes rapidly as clumps of material move within it and then spiral into the black hole, and may be quite thick — as thick as the black hole is wide, or even thicker.)

Well, we can think of the disk as many concentric circles of light placed together. The direct images of the disk (shown in Figure 6 left, on one side of the disk, as an orange wash) would form a disk in your camera, the dim red region in Figure 6 right; the hole at its center would appear larger than it really is due to the bending caused by the black hole’s gravity, but the shape would be similar. However, the indirect images would all pile up in almost the same place from your perspective, forming a bright and quite thin ring, the bright yellow circle in Figure 6 right. (The path of the disk’s first indirect image is shown in Figure 6 left, going halfway about the black hole as a green wash; notice how it narrows as it travels, which is why it appears as a narrow ring in the image at right.) This circle — the full set of indirect images of the whole disk — is the edge of the photon-sphere for a non-rotating black hole, and the circular photon ring for a rotating black hole viewed from its north or south pole.

BHDisk2.png

Figure 6: A glowing disk of material (note it does not touch the black hole) looks like a version of Figure 5 with many more circular bulbs. The direct image of the disk forms a disk (illustrated at left, for a piece of the disk, as an orange wash) while the first indirect image becomes highly compressed (illustrated, for a piece of the disk, as a green wash) and is seen as a narrow circle of bright light.  (It is expected that the disk is mostly transparent in radio waves, so the indirect image can pass through it.) That circle, along with the other indirect images, forms the photon ring. In this case, because the disk’s inner edge lies close to the black hole horizon, the photon ring sits within the disk’s direct image, but we’ll see a different example in Figure 9.

[Gralla et al. call the first indirect image the `lensed ring’ and the remaining indirect images, currently unobservable at EHT, the `photon ring’, while EHT refers to all the indirect images as the `photon ring’. Just letting you know in case you hear `lensed ring’ referred to in future.]

So the conclusion is that if we had a perfect camera, the direct image of a disk makes a disk, but the indirect images (mainly just the first one, as Gralla et al. emphasize) make a bright, thin ring that may be superposed upon the direct image of the disk, depending on the disk’s shape.

And this conclusion, with some important adjustments, applies also for a spinning black hole viewed from above its north or south pole — i.e., along its axis of rotation — or from near that axis; I’ll mention the adjustments in a moment.

But EHT is not a perfect camera. To make the black hole image, technology had to be pushed to its absolute limits. Someday we’ll see both the disk and the ring, but right now, the two are blurred together. So which one is more important?

From a Blurry Image to Blurry Knowledge

What does a blurry camera do to this simple image? You might think that the disk is so dim and the ring so bright that the camera will mainly show you a blurry image of the bright photon ring. But that’s wrong. The ring isn’t bright enough. A simple calculation reveals that the photo will show mainly the disk, not the photon ring! This is shown in Figure 7, which you can compare with the black hole `photo’ (Figure 8). (Figure 7 is symmetric around the ring, but the photo is not, for multiple reasons — Doppler-like effect from rotation, viewpoint off the rotation axis, etc. — which I’ll have to defer till another post.)

More precisely, the ring and disk blur together, but the brightness of the image is dominated by the disk, not the ring.

BHBlurDisk_a1_2.png

Figure 7: At left is repeated the image in Figure 6, as seen in a perfect camera, while at right the same image is shown when observed using a camera with imperfect vision. The disk and ring blur together into a single thick ring, whose brightness is dominated by the disk. Note that the shadow — the region surrounded by the yellow photon ring — is not the same as the dark patch in the right-hand image; the dark patch is considerably smaller than the shadow.
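The arithmetic behind "the ring isn't bright enough" can be mimicked in a toy one-dimensional model (all the numbers below are invented purely for illustration): a thin ring can be ten times brighter per pixel than the disk and still carry far less total light, and what survives blurring is the total light within the telescope's beam.

```python
import numpy as np

x = np.linspace(-50, 50, 1001)   # radial coordinate, arbitrary units

# Invented brightness profiles: a broad dim disk and a thin bright ring.
disk = np.where((np.abs(x) > 8) & (np.abs(x) < 40), 1.0, 0.0)
ring = np.where(np.abs(np.abs(x) - 10.0) < 0.25, 10.0, 0.0)

def blur(profile, sigma):
    """Convolve a profile with a normalized Gaussian beam of width sigma."""
    kernel = np.exp(-0.5 * (x / sigma) ** 2)
    kernel /= kernel.sum()
    return np.convolve(profile, kernel, mode="same")

print(f"total light: disk = {disk.sum():.0f}, ring = {ring.sum():.0f}")
print(f"blurred peak brightness: disk = {blur(disk, 5.0).max():.2f}, "
      f"ring = {blur(ring, 5.0).max():.2f}")
```

In this toy example the ring is ten times brighter where it sits, yet after blurring the disk dominates the peak brightness of the combined image.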

Let’s say that again: the black hole `photo’ may mainly show the M87bh’s accretion disk, with the photon ring contributing only some of the light, and therefore the photon ring does not completely and unambiguously determine the radius of the observed dark patch in the `photo’. In general, the patch could be considerably smaller than what is usually termed the `shadow’ of the black hole.

M87BH_Vicinity_Photo_2a.png

Figure 8: (Left) We probably observe the M87bh at a small angle off its south pole. Its accretion disk has an unknown size and shape — it may be quite thick and non-uniform — and it may not even lie at the black hole’s equator. The disk and the black hole interact to create outward-going jets of material (observed already many years ago but not clearly visible in the EHT ‘photo’.) (Right) The EHT `photo’ of the M87bh (taken in radio waves and shown in false color!) Compare with Figure 7; the most important difference is that one side of the image is brighter than the other. This likely arises from (a) our view being slightly off from the south pole, combined with (b) rotation of the black hole and its disk, and (c) possibly other more subtle issues.

This is important. The photon ring’s diameter, and thus the width of the `shadow’ too, barely depend on the rotation rate of the black hole; they depend almost exclusively on the black hole’s mass. So if the ring in the photo were simply the photon ring of the M87bh, you’d have a very simple way to measure the black hole’s mass without knowing its rotation rate: you’d look at how large the dark patch is, or equivalently, the diameter of the blurry ring, and that would give you the answer to within 10%. But it’s nowhere near so simple if the blurry ring shows the accretion disk, because the accretion disk’s properties and appearance can vary much more than the photon ring; they can depend strongly on the black hole’s rotation rate, and also on magnetic fields and other details of the black hole’s vicinity.
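To make the "simple way" concrete: for a non-rotating black hole the photon ring's apparent radius is √27·GM/c², so its angular diameter is 2√27·GM/(c²·D) for a hole at distance D. A sketch with ballpark inputs (the mass and distance below are assumed round numbers, not EHT's careful fit):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s
M_sun = 1.989e30  # solar mass, kg
pc = 3.086e16     # parsec, m

# Assumed round-number inputs for M87:
M = 6.5e9 * M_sun   # black hole mass
D = 16.8e6 * pc     # distance to M87

# Non-rotating case: photon ring at apparent radius sqrt(27) GM/c^2.
b = math.sqrt(27.0) * G * M / c**2                    # ring radius, meters
theta = 2.0 * b / D                                   # angular diameter, radians
theta_uas = theta * (180.0 / math.pi) * 3600.0 * 1e6  # microarcseconds

print(f"photon-ring angular diameter ~ {theta_uas:.0f} microarcseconds")
```

This lands around 40 microarcseconds, and spin shifts it only modestly (the roughly 10% mentioned above), which is why a pure photon ring would pin down the mass so cleanly.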

The Important Role of Rotation

If we conclude that EHT is seeing a mix of the accretion disk with the photon ring, with the former dominating the brightness, then this makes EHT’s measurement of the M87bh’s mass more confusing and even potentially suspect. Hence: controversy. Is it possible that EHT underestimated their uncertainties, and that their measurement of the black hole mass has more ambiguities, and is not as precise, as they currently claim?

Here’s where the rotation rate is important. Despite what I showed (for pedagogical simplicity) in Figure 7, for a non-rotating black hole the accretion disk’s central gap is actually expected to lie outside the photon ring; this is shown at the top of Figure 9.  But  the faster the black hole rotates, the smaller this central gap is expected to be, to the point that for a fast-rotating black hole the gap will lie inside the photon ring, as shown at the bottom of Figure 9. (This tendency is not obvious; it requires understanding details of the black hole geometry.) And if that is true, the dark patch in the EHT image may not be the black hole’s full shadow (i.e. quasi-silhouette), which is the region inside the photon ring. It may be just the inner portion of it, with the outer portion obscured by emission from the accretion disk.

The effects of blurring in the two cases of slow (or zero) and fast rotation are illustrated in Figure 9, where the photon ring’s size is taken to be the same in each case but the disk’s inner edge is close in or far out. (The black holes, not illustrated since they aren’t visible anyway, differ in mass by about 10% in order to have the photon ring the same size.) This shows why the size of the dark patch can be quite different, depending on the disk’s shape, even when the photon ring’s size is the same.

BHBlurDisk_a0_a1_3.png

Figure 9: Comparing the appearance of slightly more realistically-shaped disks around slowly rotating or non-rotating black holes (top) to those around fast-rotating black holes (bottom) of the same mass, as seen from the north or south pole. (Left) the view in a perfect camera; (right) rough illustration of the effect of blurring in the current version of the EHT. The faster the black hole is spinning, the smaller the central gap in the accretion disk is likely to be. No matter what the extent of the accretion disk (dark red), the photon ring (yellow) remains at roughly the same location, changing only by 10% between a non-rotating black hole and a maximally rotating black hole of the same mass. But blurring in the camera combines the disk and photon ring into a thick ring whose brightness is dominated by the disk rather than the ring, and which can therefore be of different size even though the mass is the same. This implies that the radius of the blurry ring in the EHT `photo’, and the size of the dark region inside it, cannot by themselves tell us the black hole’s mass; at a minimum we must also know the rotation rate (which we do not.)

Gralla et al. subtly raise these questions but are careful not to overstate their case, perhaps because they have not yet completed their study of rotating black holes. But the question is now in the air.

I’m interested to hear what the EHT folks have to say about it, as I’m sure they have detailed arguments in favor of their procedures. In particular, EHT’s simulations show all of the effects mentioned above; none of this is unknown to them. (In fact, the reason I know my illustrations above are reasonable is partly because you can see similar pictures in the EHT papers.) As long as the EHT folks correctly accounted for all the issues, then they should have been able to properly measure the mass and estimate their uncertainties correctly. In fact, they don’t really use the photo itself; they use more subtle techniques applied to their telescope data directly. Thus it’s not enough to argue the photo itself is ambiguous; one has to argue that EHT’s more subtle analysis methods are flawed. No one has argued that yet, as far as I am aware.

But the one thing that’s clear right now is that science writers almost uniformly got it wrong [because the experts didn’t explain these points well] when they tried to describe the image two months ago. The `photo’ probably does not show “a photon ring surrounding a shadow.” That would be nice and simple and impressive-sounding, since it refers to fundamental properties of the black hole’s warping effects on space. But it’s far too glib, as Figures 7 and 9 show. We’re probably seeing an accretion disk supplemented by a photon ring, all blurred together, and the dark region may well be smaller than the black hole’s shadow.

(Rather than, or in addition to, the accretion disk, it is also possible that the dominant emission in the photo comes from the inner portion of one of the jets that emerges from the vicinity of the black hole; see Figure 8 above. This is another detail that makes the situation more difficult to interpret, but doesn’t change the main point I’m making.)

Someday in the not distant future, improved imaging should allow EHT to separately image the photon ring and the disk, so both can be observed easily, as in the left side of Figure 9. Then all these questions will be answered definitively.

Why the Gargantua Black Hole from Interstellar is Completely Different

Just as a quick aside, what would you see if an accretion disk were edge-on rather than face-on? Then, in a perfect camera, you’d see something like the famous picture of Gargantua, the black hole from the movie Interstellar — a direct image of the front edge of the disk, and a strongly lensed indirect image of the back side of the disk, appearing both above and below the black hole, as illustrated in Figure 10. And that leads to the Gargantua image from the movie, also shown in Figure 10. Notice the photon ring (which is, as I cautioned you earlier, off-center!) [Note added: this figure has been modified; in the original version I referred to the top and bottom views of the disk’s far side as the “1st indirect image”, but as pointed out by Professor Jean-Pierre Luminet, that’s not correct terminology here.]

BHGarg4.png

Figure 10: The movie Interstellar features a visit to an imaginary black hole called Gargantua, and the simulated images in the movie (from 2014) are taken from near the equator, not the pole. As a result, the direct image of the disk cuts across the black hole, and indirect images of the back side of the disk are seen above and below the black hole. There is also a bright photon ring, slightly off center; this is well outside the surface of the black hole, which is not visible. A real image would not be symmetric left-to-right; it would be brighter on the side that is rotating toward the viewer.  At the bottom is shown a much more realistic visual image (albeit not so good quality) from 1994 by Jean-Alain Marck, in which this asymmetry can be seen clearly.

However, the movie image leaves out an important Doppler-like effect (which I’ll explain someday when I understand it 100%). This makes the part of the disk that is rotating toward us bright, and the part rotating away from us dim… and so a real image from this vantage point would be very asymmetric — bright on the left, dim on the right — unlike the movie image. At the suggestion of Professor Jean-Pierre Luminet I have added, at the bottom of Figure 10, a very early simulation by Jean-Alain Marck that shows this effect.

I mention this because a number of expert science journalists incorrectly explained the M87 image by referring to Gargantua — but that image has essentially nothing to do with the recent black hole `photo’. M87’s accretion disk is certainly not edge-on. The movie’s Gargantua image is taken from the equator, not from near the pole.

Final Remarks: Where a Rotating Black Hole Differs from a Non-Rotating One

Before I quit for the week, I’ll just summarize a few big differences for fast-rotating black holes compared to non-rotating ones.

1) As I’ve just emphasized, what a rotating black hole looks like to a distant observer depends not only on where the matter around the black hole is located but also on how the black hole’s rotation axis is oriented relative to the observer. A pole observer, an equatorial observer, and a near-pole observer see quite different things. (As noted in Figure 8, we are apparently near-south-pole observers for M87’s black hole.)

Let’s assume that the accretion disk lies in the same plane as the black hole’s equator — there are some reasons to expect this. Even then, the story is complex.

2) As I mentioned above, instead of a photon-sphere, there is a ‘photon-zone’ — a region where specially aimed photons can travel round the black hole multiple times. For high-enough spin (greater than about 80% of maximum as I recall), an accretion disk’s inner edge can lie within the photon zone, or even closer to the black hole than the photon zone; and this can cause a filling-in of the ‘shadow’.

3) Depending on the viewing angle, the indirect images of the disk that form the photon ring may not be a circle, and may not be concentric with the direct image of the disk. Only when viewed from along the rotation axis (i.e., above the north or south pole) will the direct and indirect images of the disk all be circular and concentric. We’re not viewing the M87bh on its axis, and that further complicates interpretation of the blurry image.

4) When the viewing angle is not along the rotation axis the image will be asymmetric, brighter on one side than the other. (This is true of EHT’s `photo’.) However, I know of at least four potential causes of this asymmetry, any or all of which might play a role, and the degree of asymmetry depends on properties of the accretion disk and the rotation rate of the black hole, both of which are currently unknown. Claims about the asymmetry made by the EHT folks seem, at least to me, to be based on certain assumptions that I, at least, cannot currently check.

Each of these complexities is a challenge to explain, so I’ll give both you and me a substantial break while I figure out how best to convey what is known (at least to me) about these issues.

by Matt Strassler at June 14, 2019 12:15 PM

June 11, 2019

Georg von Hippel - Life on the lattice

Looking for guest bloggers to cover LATTICE 2019
My excellent reason for not attending LATTICE 2018 has become a lot bigger, much better at many things, and (if possible) even more beautiful — which means I won't be able to attend LATTICE 2019 either (I fully expect to attend LATTICE 2020, though). So once again I would greatly welcome guest bloggers willing to cover LATTICE 2019; if you are at all interested, please send me an email and we can arrange to grant you posting rights.

by Georg v. Hippel (noreply@blogger.com) at June 11, 2019 10:28 AM

Georg von Hippel - Life on the lattice

Book Review: "Lattice QCD — Practical Essentials"
There is a new book about Lattice QCD, Lattice Quantum Chromodynamics: Practical Essentials by Francesco Knechtli, Michael Günther and Mike Peardon. At 140 pages, this is a pretty slim volume, so it is obvious that it does not aim to displace time-honoured introductory textbooks like Montvay and Münster, or the newer books by Gattringer and Lang or DeGrand and DeTar. Instead, as suggested by the subtitle "Practical Essentials", and as said explicitly by the authors in their preface, this book aims to prepare beginning graduate students for their practical work in generating gauge configurations and measuring and analysing correlators.

In line with this aim, the authors spend relatively little time on the physical or field-theoretic background; while some more advanced topics such as the Nielsen-Ninomiya theorem and the Symanzik effective theory are touched upon, the treatment of foundational topics is generally quite brief, and some topics, such as lattice perturbation theory or non-perturbative renormalization, are omitted altogether. The focus of the book is on Monte Carlo simulations, for which both the basic ideas and practically relevant algorithms — heatbath and overrelaxation for pure gauge fields, and hybrid Monte Carlo (HMC) for dynamical fermions — are described in some detail, including the RHMC algorithm and advanced techniques such as determinant factorizations, higher-order symplectic integrators, and multiple-timescale integration. The techniques from linear algebra required to deal with fermions are also covered in some detail, from the basic ideas of Krylov-space methods through concrete descriptions of the GMRES and CG algorithms, along with such important preconditioners as even-odd and domain decomposition, to the ideas of algebraic multigrid methods. Stochastic estimation of all-to-all propagators with dilution, the one-end trick and low-mode averaging are explained, as are techniques for building interpolating operators with specific quantum numbers, gauge link and quark field smearing, and the use of the variational method to extract hadronic mass spectra. Scale setting, the Wilson flow, and Lüscher's method for extracting scattering phase shifts are also discussed briefly, as are the basic statistical techniques for data analysis. Each chapter contains a list of references to the literature covering both original research articles and reviews and textbooks for further study.

Overall, I feel that the authors succeed very well at their stated aim of giving a quick introduction to the methods most relevant to current research in lattice QCD in order to let graduate students hit the ground running and get to perform research as quickly as possible. In fact, I am slightly worried that they may turn out to be too successful, since a graduate student having studied only this book could well start performing research, while having only a very limited understanding of the underlying field-theoretical ideas and problems (a problem that already exists in our field in any case). While this in no way detracts from the authors' achievement, and while I feel I can recommend this book to beginners, I nevertheless have to add that it should be complemented by a more field-theoretically oriented traditional textbook for completeness.

___
Note that I have deliberately not linked to the Amazon page for this book. Please support your local bookstore — nowadays, you can usually order online on their websites, and many bookstores are more than happy to ship books by post.

by Georg v. Hippel (noreply@blogger.com) at June 11, 2019 10:27 AM

June 10, 2019

Matt Strassler - Of Particular Significance

Minor Technical Difficulty with WordPress

Hi all — sorry to bother you with an issue you may not even have noticed, but about 18 hours ago a post of mine that was under construction was accidentally published, due to a WordPress bug.  Since it isn’t done yet, it isn’t readable (and has no figures yet) and may still contain errors and typos, so of course I tried to take it down immediately.  But it seems some of you are still getting the announcement of it or are able to read parts of it.  Anyway, I suggest you completely ignore it, because I’m not done working out the scientific details yet, nor have I had it checked by my more expert colleagues; the prose and perhaps even the title may change greatly before the post comes out later this week.  Just hang tight and stay tuned…

by Matt Strassler at June 10, 2019 11:43 PM

Matt Strassler - Of Particular Significance

The Black Hole Photo: Controversy Begins To Bubble Up

It’s been a couple of months since the `photo’ (a false-color image created to show the intensity of radio waves, not visible light) of the black hole at the center of the galaxy M87, taken by the Event Horizon Telescope (EHT) collaboration, was made public. Before it was shown, I wrote an introductory post explaining what the ‘photo’ is and isn’t. There I cautioned readers that I thought it might be difficult to interpret the image, and controversies about it might erupt. This concern seems to have been warranted. This is the first post of several in which I’ll explain the issue as I see it.

So far, the claim that the image shows the vicinity of M87’s black hole (which I’ll call `M87bh’ for short) has not been challenged, and I’m not expecting it to be. But what and where exactly is the material that is emitting the radio waves and thus creating the glow in the image? And what exactly determines the size of the dark region at the center of the image? That’s been a problematic issue from the beginning, but discussion is starting to heat up.  And it’s important: it has implications for the measurement of the black hole’s mass, and of any attempt to estimate its rotation rate.

Over the last few weeks I’ve spent some time studying the mathematics of spinning black holes, talking to my Harvard colleagues who are among the world’s experts on the relevant math and physics, and learning from colleagues who produced the `photo’ and interpreted it. So I think I can now clearly explain what most journalists and scientist-writers (including me) got wrong at the time of the photo’s publication, and clarify what the photo does and doesn’t tell us.

[I am heavily indebted to Harvard postdocs Alex Lupsasca and Shahar Hadar for assisting me as I studied the formulas and concepts relevant for fast-spinning black holes. Much of what I learned comes from early 1970s papers, especially those by my former colleague Professor Jim Bardeen (see this one written with Press and Teukolsky), and from papers written in the last couple of years, especially this one by my present and former Harvard colleagues.]

What does the EHT Image Show?

Scientists understand the black hole itself — the geometric dimple in space and time — pretty well.  If one knows the mass and the rotation rate of the black hole, and assumes Einstein’s equations for gravity are mostly correct (for which we have considerable evidence, for example from LIGO measurements and elsewhere), then the equations tell us what the black hole does to space and time and how its gravity works.

But for the `photo’, that’s not enough information. We don’t get to observe the black hole itself (it’s black, after all!) What the `photo’ shows is a blurry ring of radio waves, emitted from hot material (mostly electrons and protons) somewhere around the black hole — material whose location, velocity, and temperature we do not know. That material and its emission of radio waves are influenced by powerful gravitational forces (whose details depend on the rotation rate of the M87bh, which we don’t know yet) and powerful magnetic fields (whose details we hardly know at all.) The black hole then bends the paths of the radio waves extensively, even more than does a glass lens, so that where things appear in the image is not where they are actually located.

The only insights we have into this extreme environment come from computer simulations and a few other `photos’ at lower magnification. The simulations are based on well-understood equations, but the equations have to be solved approximately, using methods that may or may not be justified. And the simulations don’t tell you where the matter is; they tell you where the material will go, but only after you make a guess as to where it is located at some initial point in time.  (In the same sense: computers can predict the national weather tomorrow only when you tell them what the national weather was yesterday.) No one knows for sure how accurate or misleading these simulations might be; they’ve been tested against some indirect measurements, but no one can say for sure what flaws they might have.

However, there is one thing we can certainly say, and a paper by Gralla, Holz and Wald has just said it publicly.

When the EHT `photo’ appeared, it was widely reported that it shows the image of a photon sphere at the edge of the shadow (or ‘quasi-silhouette‘, a term I suggested as somewhat less misleading) of the M87bh.

[Like the Earth’s equator, the photon sphere is a location, not an object.  Photons (the particles that make up light, radio waves, and all other electromagnetic radiation) that move along the photon sphere have special, spherical orbits around the black hole.]

Unfortunately, it seems likely that these statements are incorrect; and Gralla et al. have said almost as much in their new preprint (though they were careful not to make a precise claim.)

 

The Photon Sphere Doesn’t Exist

Indeed, if you happened to be reading my posts carefully back then, you probably noticed that I was quite vague about the photon-sphere — I never defined precisely what it was. You would have been right to read this as a warning sign, for indeed I wasn’t getting clear explanations of it from anyone. A couple of weeks later, as I studied the equations and conversed with colleagues, I learned why; for a rotating black hole, the photon sphere doesn’t really exist. There’s a broad `photon-zone’ where photons can have special orbits, but you won’t ever see the whole photon zone in an image of a rotating black hole. Instead a piece of the photon zone will show up as a `photon ring’, a bright thin loop of radio waves.

But this ring is not the edge of anything spherical, is generally not perfectly circular, and is not even perfectly centered on the black hole.

… and the Photon Ring Isn’t What We See…

It seems likely that the M87bh is rotating quite rapidly, so it probably has no photon-sphere.  But does it show a photon ring?  Although some of the EHT folks seemed to suggest the answer was ‘yes’, Gralla et al. suggest the answer is likely `no’ (and my Harvard colleagues were finding the same thing.)  It seems unlikely that the circlet of radio waves that appears in the EHT `photo’ is really an image of M87bh’s photon ring anyway; it’s probably something else.  That’s where controversy starts.

…so the Dark Patch is Probably Not the Full Shadow

The term `shadow’ is confusing (which is why I prefer `quasi-silhouette’) but no matter what you call it, in its ideal form it is a perfectly dark area whose edge is the photon ring. But in reality the perfectly dark area need not appear so dark after all; it may be filled in by various effects. Furthermore, since the `photo’ may not show us the photon ring, it’s far from clear that the dark patch in the center is the full shadow anyway.

Step-By-Step Approach

To explain these points will take some time and care, so I’m going to spread the explanation out over several blog posts.  Otherwise it’s just too much information too fast, and I won’t do a good job writing it down.  So bear with me… expect at least three more posts, probably four, and even then there will still be important issues to return to in future.

The Appearance of a  Black Hole With Nearby Matter

Because fast-rotating black holes are complicated, I’m going to illuminate the controversy using a non-rotating black hole’s properties, which is also what Gralla et al. mainly do in their paper. It turns out the qualitative conclusion drawn from the non-rotating case largely applies in the rotating case too, at least in the case of the M87BH as seen from our perspective; that’s important because the M87BH is probably rotating at a very good clip. (At the end of this post I’ll briefly describe some key differences between the appearance of non-rotating black holes, rotating black holes observed along the rotation axis, and rotating black holes observed just a bit off the rotation axis.)

A little terminology first: for a rotating black hole there’s a natural definition of the poles and the equator, just as there is for the Earth: there’s an axis of rotation, and the poles are where that axis intersects with the black hole horizon. The equator is the circle that lies halfway between the poles. For a non-rotating black hole, there’s no such axis and no such automatic definition, but it will be useful to define the north pole of the black hole to be the point on the horizon closest to us.

A Single Source of Electromagnetic Waves

Let’s imagine placing a bright light bulb on the same plane as the equator, outside the black hole horizon but rather close to it. (The bulb could emit radio waves or visible light or any other form of electromagnetic waves, at any frequency; for what I’m about to say, it doesn’t matter at all, so I’ll just call it `light’.)  See Figure 1.  Where would the light from the bulb go?

Some of it, heading inward, ends up in the black hole, while some of it heads outward toward distant observers. The gravity of the black hole will bend the path of the light. And here’s something remarkable: a small fraction of the light, aimed just so, can actually spiral around the black hole any number of times before heading out. As a result, you will see the bulb not once but multiple times!

There will be a direct image — light that comes directly to us — from near the bulb’s true location (displaced because gravity bends the light a bit, just as a glass lens will distort the appearance of what’s behind it.) That’s the orange arrow in Figure 1.  But then there will be an indirect image from light that goes halfway (the green arrow in Figure 1) around the black hole before heading in our direction; we will see that image of the bulb on the opposite side of the black hole. Let’s call that the `first indirect image.’ Then there will be a second indirect image from light that orbits the black hole once and comes out near the direct image, but further out; that’s the blue arrow in Figure 1. Then there will be a third indirect image from light that goes around one and a half times (not shown), and so on. Figure 1 shows the paths of the direct, first indirect, and second indirect images of the bulb as they head toward our location at the top of the image.

What you can see in Figure 1 is that both the first and second indirect images are formed by light (er, radio waves) that spends part of its time close to a special radius around the black hole, shown as a dotted line. This, in the case of a non-rotating black hole, is an honest “photon-sphere”.
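For the non-rotating case, that special radius is a standard textbook result: the photon sphere sits at 1.5 times the horizon radius, and the ring a distant camera sees corresponds to the slightly larger critical impact parameter. A quick numerical sketch (plugging in a one-solar-mass black hole purely for scale):

```python
# Photon-sphere radius and critical impact parameter for a
# non-rotating (Schwarzschild) black hole -- standard results:
#   horizon radius            r_s  = 2GM/c^2
#   photon sphere             r_ph = 3GM/c^2   (1.5 x the horizon)
#   critical impact parameter b_c  = 3*sqrt(3) GM/c^2  (apparent ring radius)
import math

G = 6.674e-11        # m^3 kg^-1 s^-2
c = 2.998e8          # m/s
M_sun = 1.989e30     # kg

def schwarzschild_radii(mass_kg):
    gm_c2 = G * mass_kg / c**2            # gravitational radius GM/c^2
    r_s  = 2.0 * gm_c2                    # event horizon
    r_ph = 3.0 * gm_c2                    # photon sphere
    b_c  = 3.0 * math.sqrt(3.0) * gm_c2   # where the ring appears from afar
    return r_s, r_ph, b_c

r_s, r_ph, b_c = schwarzschild_radii(M_sun)
print(f"horizon {r_s/1e3:.2f} km, photon sphere {r_ph/1e3:.2f} km, "
      f"ring radius {b_c/1e3:.2f} km")
```

Note that the ring appears at about 2.6 horizon radii, noticeably outside the photon sphere itself, because of the light-bending.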

In the case of a rotating black hole, something very similar happens when you’re looking at the black hole from its north pole; there’s a special circle then too.  But that circle is not the edge of a photon-sphere!  In general, photons can orbit in a wide region, which I’ll call the “photon-zone.” You’ll see photons from other parts of the photon zone if you look at the black hole not from the north pole but from some other angle.

What our radio-wave camera will see, looking at what is emitted from the light bulb, is shown in Figure 2: an infinite number of increasingly squished `indirect’ images, half on one side of the black hole near the direct image, and the other half on the other side. What is not obvious, but true, is that only the first of the indirect images is bright; this is one of Gralla et al.’s main points. We can, therefore, separate the images into the direct image, the first indirect image, and the remaining indirect images. The total amount of light coming from the direct image and the first indirect image can be large, but the total amount of light from the remaining indirect images is typically (according to Gralla et al.) less than 5% of the light from the first indirect image. And so, unless we have an extremely high resolution camera, we’ll never pick those other images up. Consequently, all we can really hope to detect with something like EHT is the direct image and the first indirect image.
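A way to see where that few-percent figure comes from: for a non-rotating black hole, each additional half-orbit near the photon sphere demagnifies the image by a factor of roughly e^(−π) ≈ 0.043, a standard result for Schwarzschild photon orbits (the rotating case differs in detail). Summing the resulting geometric series gives the total light in all images beyond the first indirect one:

```python
# Each successive indirect image is fainter by ~exp(-pi) per extra
# half-orbit (standard Schwarzschild result).  Summing the geometric
# series: all images beyond the first indirect one together carry only
# a few percent of the first indirect image's light.
import math

demag = math.exp(-math.pi)    # ~0.043 suppression per extra half-orbit

# brightness of indirect image n relative to the first indirect image
# is demag**(n-1); sum images 2, 3, 4, ...
remaining = sum(demag**n for n in range(1, 50))
print(f"later images / first indirect image = {remaining:.3f}")
```

The sum comes out to about 4.5%, consistent with the “less than 5%” quoted above.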

WARNING (since this seems to be a common confusion even after two months):

IN ALL MY FIGURES IN THIS POST, AS IN THE BLACK HOLE `PHOTO’ ITSELF, THE COLORS OF THE IMAGE ARE CHOSEN ARBITRARILY (as explained in my first blog post on this subject.) THE `PHOTO’ WAS TAKEN AT A SINGLE, NON-VISIBLE FREQUENCY OF ELECTROMAGNETIC WAVES: EVEN IF WE COULD SEE THAT TYPE OF RADIO WAVE WITH OUR EYES, IT WOULD BE A SINGLE COLOR, AND THE ONLY THING THAT WOULD VARY ACROSS THE IMAGE IS BRIGHTNESS. IN THIS SENSE, A BLACK AND WHITE IMAGE MIGHT BE CLEARER CONCEPTUALLY, BUT IT IS HARDER FOR THE EYE TO PROCESS.

A Circular Source of Electromagnetic Waves

Let’s replace our ordinary bulb by a circular bulb (Figure 3), again set somewhat close to the horizon, sitting in the plane that contains the equator. What would we see now? Figure 4: The direct image is a circle (looking somewhat larger than it really is); outside it sits the first indirect image of the ring; and then come all the other indirect images, looking quite dim and all piling up at one radius. We’re going to call all those piled-up images the “photon ring”.

Importantly, if we replace that circular bulb [shown yellow in Figure 5] by one of a larger or smaller radius [shown blue in Figure 5], then (Figure 6) the inner direct image would look larger or smaller to us, but the indirect images would barely move. They remain very close to the same size no matter how big a circular bulb we chose!

A Disk as a Source of Electromagnetic Waves

And what if you replaced the circular bulb with a disk-shaped bulb, a sort of glowing pancake with a circular hole at its center, as in Figure 7? That’s relevant because black holes are thought to have `accretion disks’ of material (possibly quite thick — I’m showing a very thin one for illustration, but they can be as thick as the black hole is wide, or even thicker) that orbit them. The accretion disk may be the source of the radio waves at M87’s black hole. Well, we can think of the disk as many concentric circles of light placed together. The direct images of the disk (shown on one side of the disk as an orange wash) would form a disk in your camera (Figure 8); the hole at its center would appear larger than it really is due to the bending caused by the black hole’s gravity, but the shape would be the same. However, the indirect images (the first of which is shown going halfway about the black hole as a green wash) would all pile up in the same place from your perspective, forming a bright and quite thin ring. This is the photon ring for a non-spinning black hole — the full set of indirect images of everything that lies at or inside the photon sphere but outside the black hole horizon.

[Gralla et al. call the first indirect image the `lensed ring’ and the remaining indirect images, completely unobservable at EHT, the `photon ring’. I don’t know if their notation will be adopted but you might hear `lensed ring’ referred to in future. In any case, what EHT calls the photon ring includes what Gralla et al. call the lensed ring.]

So the conclusion is that if we had a perfect camera, the direct image of a disk makes a disk, but the indirect images (mainly just the first one, as Gralla et al. emphasize) make a bright, thin ring that may be superposed upon the direct image of the disk, depending on the disk’s shape.

And this conclusion, with some important adjustments, applies also for a spinning black hole viewed from above its north or south pole — its axis of rotation — or from near that axis; I’ll mention the adjustments in a moment.

But EHT is not a perfect camera. To make the black hole image, it had to be pushed to its absolute limits.  Someday we’ll see both the disk and the ring, but right now, they’re all blurred together.  So which one is more important?

From a Blurry Image to Blurry Knowledge

What does a blurry camera do to this simple image? You might think that the disk is so dim that the camera will mainly show you a blurry image of the bright photon ring. But that’s wrong. The ring isn’t bright enough. A simple calculation reveals that blurring the ring makes it dimmer than the disk! The photo, therefore, will show mainly the accretion disk, not the photon ring! This is shown in Figure 9, which you can compare with the Black Hole `photo’ (Figure 10).  (Figure 9 is symmetric around the ring, but the photo is not, for multiple reasons — rotation, viewpoint off the rotation axis, etc. — which I’ll have to defer till another post.)

More precisely, the ring and disk blur together, but the image is dominated by the disk.
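You can see why blurring favors the disk with a toy one-dimensional model: give a thin ring and a broad disk comparable total flux, smooth both with a beam much wider than the ring, and the ring’s peak surface brightness collapses by roughly (ring width)/(beam width) while the disk’s barely changes. A minimal numpy sketch; all the numbers here are invented for illustration (including giving the disk twice the ring’s flux), only the relative widths matter:

```python
# Toy 1D model: thin bright ring vs broad dim disk, blurred by a beam
# much wider than the ring.  All numbers invented for illustration.
import numpy as np

x = np.linspace(-60, 60, 2401)          # angle, arbitrary units
dx = x[1] - x[0]

ring = np.where(np.abs(np.abs(x) - 20.0) < 0.5, 1.0, 0.0)  # width ~1
disk = np.where(np.abs(x) < 30.0, 1.0, 0.0)                # width 60
ring *= 1.0 / (ring.sum() * dx)   # total ring flux = 1
disk *= 2.0 / (disk.sum() * dx)   # total disk flux = 2 (a guess)

beam = np.exp(-x**2 / (2 * 10.0**2))    # beam width 10 >> ring width
beam /= beam.sum() * dx                 # normalize so blurring conserves flux

blur = lambda img: np.convolve(img, beam, mode="same") * dx
ring_b, disk_b = blur(ring), blur(disk)

# unblurred, the ring's peak dwarfs the disk's; blurred, the disk wins
print(ring.max() / disk.max())      # >> 1
print(ring_b.max() / disk_b.max())  # < 1
```

The thin ring’s flux gets spread over the whole beam, while the already-broad disk is nearly unchanged; that is the entire effect.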

Let’s say that again: the black hole `photo’ is likely showing the accretion disk, with the photon ring contributing only some of the light, and therefore the photon ring does not completely and unambiguously determine the radius of the observed dark patch in the `photo’.  In general, the patch may well be smaller than what is usually termed the `shadow’ of the black hole.

This is very important. The photon ring’s radius barely depends on the rotation rate of the black hole, and therefore, if the light were coming from the ring, you’d know (without knowing the black hole’s rotation rate) how big its dark patch will appear for a given mass. You could therefore use the radius of the ring in the photo to determine the black hole’s mass. But the accretion disk’s properties and appearance can vary much more. Depending on the spin of the black hole and the details of the matter that’s spiraling into the black hole, its radius can be larger or smaller than the photon ring’s radius… making the measurement of the mass both more ambiguous and — if you partially mistook the accretion disk for the photon ring — potentially suspect. Hence: controversy. Is it possible that EHT underestimated their uncertainties, and that their measurement of the black hole mass has more ambiguities, and is not as precise, as they currently claim?
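Here’s the back-of-envelope version of the mass measurement. For a non-rotating black hole the photon ring’s angular radius is sqrt(27)·GM/(c²D); plugging in EHT’s quoted mass (about 6.5 billion suns) and distance (about 16.8 Mpc) for M87, this simple formula already lands near the ~42 microarcsecond ring they report. A sketch (non-rotating formula only; rotation changes it by a few percent):

```python
# Angular diameter of the photon ring for a non-rotating black hole:
#   diameter = 2 * sqrt(27) * GM / (c^2 * D)
import math

G, c = 6.674e-11, 2.998e8                # SI units
M_sun, Mpc = 1.989e30, 3.086e22          # kg, m
muas_per_rad = 180 / math.pi * 3600 * 1e6

def ring_diameter_muas(mass_msun, dist_mpc):
    b_c = math.sqrt(27) * G * mass_msun * M_sun / c**2  # critical impact parameter
    return 2 * b_c / (dist_mpc * Mpc) * muas_per_rad

d = ring_diameter_muas(6.5e9, 16.8)      # EHT's quoted mass and distance
print(f"{d:.1f} microarcseconds")        # ~40, near the measured ~42
```

Because the diameter is linear in the mass, identifying the wrong feature as the photon ring shifts the inferred mass by the same fraction; that is exactly the ambiguity at stake.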

Here’s where the rotation rate is important.  For a non-rotating black hole the accretion disk’s inner edge is expected to lie outside the photon ring, but for a fast-rotating black hole (as M87’s may well be), it will lie inside the photon ring. And if that is true, the dark patch in the EHT image may not be the black hole’s full shadow (i.e. quasi-silhouette). It may be just the inner portion of it, with the outer portion obscured by emission from the accretion disk.

Gralla et al. subtly raise these questions but are careful not to overstate their case, because they have not yet completed their study of rotating black holes. But the question is now in the air. I’m interested to hear what the EHT folks have to say about it, as I’m sure they have detailed arguments in favor of their procedures.

(Rather than the accretion disk, it is also possible that the dominant emission comes from the inner portion of one of the jets that emerges from the vicinity of the black hole. This is another detail that makes the situation more difficult to interpret, but doesn’t change the main point I’m making.)

Why the Gargantua Black Hole From Interstellar is Completely Different

Just as a quick aside, what would you see if an accretion disk were edge-on rather than face-on? Then, in a perfect camera, you’d see something like the famous picture of Gargantua, the black hole from the movie Interstellar — a direct image of the front edge of the disk, and a strongly lensed indirect image of the back side of the disk, appearing both above and below the black hole, as illustrated in Figure 11.

One thing that isn’t included in the Gargantua image from the movie (Figure 12) is a sort of Doppler effect (which I’ll explain someday when I understand it 100%). This makes the part of the disk that is rotating toward us bright, and the part rotating away from us dim… and so the image will be very asymmetric, unlike the movie image. See Figure 13 for what it would really `look’ like to the EHT.

I mention this because a number of expert science journalists incorrectly explained the M87 image by referring to Gargantua — but that image has essentially nothing to do with the recent black hole `photo’. M87’s accretion disk is certainly not edge-on. The movie’s Gargantua image is taken from the equator, not from near the pole, and does not show the Doppler effect correctly (for artistic reasons).

Where a Rotating Black Hole Differs

Before I quit for the day, I’ll just summarize a few big differences for fast-rotating black holes compared to non-rotating ones.

1) What a rotating black hole looks like to a distant observer depends not only on where the matter around the black hole is located but also on how the black hole’s rotation axis is oriented relative to the observer. A north-pole observer, an equatorial observer, and a near-north-pole observer see quite different things. (We are apparently near-south-pole observers for M87’s black hole.)

Let’s assume that the accretion disk lies in the same plane as the black hole’s equator — there are reasons to expect this. Even then, the story is complex.

2) Instead of a photon-sphere, there is what you might call a `photon-zone’ — a region where specially aimed photons can travel round the black hole multiple times. As I mentioned above, for high-enough spin (greater than about 80% of maximum as I recall), an accretion disk’s inner edge can lie within the photon zone, or even closer to the black hole than the photon zone; this leads to multiple indirect images of the disk and a potentially bright photon ring.

3) However, depending on the viewing angle, the indirect images of the disk that form the photon ring may not be a circle, and may not be concentric with the direct image of the disk. Only when viewed from points along the rotation axis (i.e., above the north or south pole) will the direct and indirect image of the disk both be circular and concentric. That further complicates interpretation of the blurry image.

4) When the viewing angle is not along the rotation axis the image will be asymmetric, brighter on one side than the other. (This is true of EHT’s `photo’.) However, I know of at least four potential causes of this asymmetry, any or all of which might play a role, and the degree of asymmetry depends on properties of the accretion disk and the rotation rate of the black hole, both of which are currently unknown. Claims about the asymmetry made by the EHT folks seem, at least to me, to be based on certain assumptions that we cannot currently check.

Each of these complexities is a challenge to explain, so I’ll give both you and me a substantial break while I figure out how best to convey what is known (at least to me) about these issues.

by Matt Strassler at June 10, 2019 04:04 AM

June 05, 2019

Clifford V. Johnson - Asymptotia

News from the Front, XVII: Super-Entropic Instability

I'm quite excited because of some new results I got recently, which appeared on the ArXiv today. I've found a new (and I think, possibly important) instability in quantum gravity.

Said more carefully, I've found a sibling to Hawking's celebrated instability that manifests itself as black hole evaporation. This new instability also results in evaporation, driven by Hawking radiation, and it can appear for black holes that might not seem unstable to evaporation in ordinary circumstances (i.e., there's no Hawking channel to decay), but turn out to be unstable upon closer examination, in a larger context. That context is the extended gravitational thermodynamics you've read me talking about here in several previous posts (see e.g. here and here). In that framework, the cosmological constant is dynamical and enters the thermodynamics as a pressure variable, p. It has a conjugate, V, which is a quantity that can be derived once you know the pressure and the mass of the black hole.

Well, Hawking evaporation is a catastrophic quantum phenomenon that follows from the fact that the radiation temperature of a Schwarzschild black hole (the simplest one you can think of) goes inversely with the mass. So the black hole radiates and loses energy, reducing its mass. But that means that it will radiate at even higher temperature, driving its mass down even more. So it will radiate even more, and so on. So it is an instability in the sense that the system drives itself even further away from where it started at every moment. Like a pencil falling over from balancing on a point.
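The runaway is easy to see in a toy model. With T ∝ 1/M and a thermal luminosity, the mass loss goes like dM/dt ∝ −1/M², so M(t)³ falls linearly and the lifetime scales as the cube of the initial mass. A rough numerical sketch, with all the physical constants absorbed into the units (purely illustrative):

```python
# Toy model of the Hawking runaway:  T ~ 1/M  =>  dM/dt ~ -1/M^2.
# In these units M(t)^3 = M0^3 - 3t, so the lifetime scales as M0 CUBED,
# and the mass loss accelerates as M shrinks -- the instability.
def lifetime(m0, dt=1e-4):
    """Crude Euler integration of dM/dt = -1/M^2 until the mass is (almost) gone."""
    m, t = m0, 0.0
    while m > 0.05 * m0:      # stop near the end; Euler blows up at M = 0
        m += -dt / m**2       # mass loss rate grows as the mass drops
        t += dt
    return t

t1, t2 = lifetime(1.0), lifetime(2.0)
print(t2 / t1)                # ~8: doubling the mass gives ~2^3 the lifetime
```

The self-reinforcing character is visible in the step itself: every loss of mass makes the next loss faster, like the pencil tipping further over.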

This is the original quantum instability for gravitational systems. It's, as you probably know, very important. (Although in our universe, the temperature of radiation is so tiny for astrophysical black holes (they have large mass) that the effect is washed out by the local temperature of the universe... But if the universe ever had microscopic black holes, they'd have radiated in this way...)

So very nice, so very 1970s. What have I found recently?

A nice way of expressing the above instability is to simply say [...] Click to continue reading this post

The post News from the Front, XVII: Super-Entropic Instability appeared first on Asymptotia.

by Clifford at June 05, 2019 02:11 AM

May 27, 2019

Cormac O’Raifeartaigh - Antimatter (Life in a puzzling universe)

A conference in Paris

This week I’m in Paris, attending a conference in memory of the outstanding British astronomer and theoretician Arthur Stanley Eddington. The conference, which is taking place at the Observatoire de Paris, is designed to celebrate the centenary of Eddington’s famous measurement of the bending of distant starlight by the sun, a key experiment that offered important early support for Einstein’s general theory of relativity. However, there are talks on lots of different topics, from Eddington’s philosophy of science to his work on the physics of stars, from his work in cosmology to his search for a unified field theory. The conference website and programme are here.


The view from my hotel in Denfert-Rochereau

All of the sessions of the conference were excellent, but today was a particular treat with four outstanding talks on the 1919 expedition. In ‘Eddington, Dyson and the Eclipse of 1919’, Daniel Kennefick of the University of Arkansas gave a superb overview of his recent book on the subject. In ‘The 1919 May 29 Eclipse: On Accuracy and Precision’, David Valls-Gabaud of the Observatoire de Paris gave a forensic analysis of Eddington’s calculations. In ‘The 1919 Eclipse: Were the Results Robust?’, Gerry Gilmore of the University of Cambridge described how recent reconstructions of the expedition measurements gave confidence in the results; and in ‘Chasing Mare’s Nests: Eddington and the Early Reception of General Relativity among Astronomers’, Jeffrey Crelinsten of the University of Toronto summarized the doubts expressed by major American astronomical groups in the early 1920s, as described in his excellent book.


I won’t describe the other sessions, but just note a few things that made this conference the sort of meeting I like best. All speakers were allocated the same speaking time (30 mins including questions); most speakers were familiar with each other’s work; many speakers spoke on the same topic, giving different perspectives; there was plenty of time for further questions and comments at the end of each day. So a superb conference organised by Florian Laguens of the IPC and David Valls-Gabaud of the Observatoire de Paris.


On the way to the conference

In my own case, I gave a talk on Eddington’s role in the discovery of the expanding universe. I have long been puzzled by the fact that Eddington, an outstanding astronomer and strong proponent of the general theory of relativity, paid no attention when his brilliant former student Georges Lemaître suggested that an expanding universe could be derived from general relativity, a phenomenon that could account for the redshifts of the spiral nebulae, the biggest astronomical puzzle of the age. After considering some standard explanations (Lemaître’s status as an early-career researcher, the journal he chose to publish in and the language of the paper), I added two considerations of my own: (i) the theoretical analysis in Lemaître’s 1927 paper would have been very demanding for a 1927 reader and (ii) the astronomical data that Lemaître relied upon were quite preliminary (Lemaître’s calculation of a redshift/distance coefficient for the nebulae relied upon astronomical distances from Hubble that were established using the method of apparent magnitude, a method that was much less reliable than Hubble’s later observations using the method of Cepheid variables).


Making my points at the Eddington Conference

It’s an interesting puzzle because it is thought that Lemaitre sent a copy of his paper to Eddington in 1927 – however I finished by admitting that there is a distinct possibility that Eddington simply didn’t take the time to read his former student’s paper. Sometimes the most boring explanation is the right one! The slides for my talk can be found here.

All in all, a superb conference.

 

by cormac at May 27, 2019 07:39 PM

May 24, 2019

Clifford V. Johnson - Asymptotia

News from the Front, XVI: Toward Quantum Heat Engines

(The following post is a bit more technical than usual. But non-experts may still find parts helpful.)

A couple of years ago I stumbled on an entire field that I had not encountered before: the study of Quantum Heat Engines. This sounds like an odd juxtaposition of terms since, as I say in the intro to my recent paper:

The thermodynamics of heat engines, refrigerators, and heat pumps is often thought to be firmly the domain of large classical systems, or put more carefully, systems that have a very large number of degrees of freedom such that thermal effects dominate over quantum effects. Nevertheless, there is a thriving field devoted to the study—both experimental and theoretical—of the thermodynamics of machines that use small quantum systems as the working substance.

It is a fascinating field, with a lot of activity going on that connects to fields like quantum information, device physics, open quantum systems, condensed matter, etc.

Anyway, I stumbled on it because, as you may know, I've been thinking (in my 21st-meets-18th century way) about heat engines a lot over the last five years since I showed how to make them from (quantum) black holes, when embedded in extended gravitational thermodynamics. I've written it all down in blog posts before, so go look if interested (here and here).

In particular, it was when working on a project I wrote about here that I stumbled on quantum heat engines, and got thinking about their power and efficiency. It was while working on that project that I had a very happy thought: Could I show that holographic heat engines (the kind I make using black holes) -at least a class of them- are actually, in some regime, quantum heat engines? That would be potentially super-useful and, of course, super-fun.

The blunt headline statement is that they are, obviously, because every stage [...] Click to continue reading this post

The post News from the Front, XVI: Toward Quantum Heat Engines appeared first on Asymptotia.

by Clifford at May 24, 2019 05:16 PM

May 14, 2019

Axel Maas - Looking Inside the Standard Model

Acquiring a new field
I have recently started to look into a new field: quantum gravity. In this entry, I would like to write a bit about how this happens, acquiring a new field, so that you can get an idea of what can lead a scientist to do such a thing. Of course, in future entries I will also write more about what I am doing, but it would be a bit early to do so right now.

Acquiring a new field in science is not something done lightly. One never has enough time for the things one is doing already. And when you enter a new field, everything is slow. You have to learn a lot of basics, need to get an overview of what has been done, and of what is still open. Not to mention that you have to get used to a different jargon. Thus, one rarely does so lightly.

I have in the past already written one entry about how I came to do Higgs physics. That entry was written after the fact. I was looking back, and discussed my motivation as I saw it at that time. It will be an interesting thing to look back at this entry in a few years, and judge what is left of my original motivation, and how I feel about it knowing what happened since then. But for now, I only know the present. So, let's get to it.

Quantum gravity is the hypothetical quantum version of the ordinary theory of gravity, so-called general relativity. However, it has withstood quantization for quite a while, though there has been huge progress in the last 25 years or so. If we could quantize it, its combination with the standard model and the simplest version of dark matter would likely be able to explain almost everything we can observe. Though even then a few open questions appear to remain.

But my interest in quantum gravity comes not from the promise of such a possibility. It has rather a quite different motivation. My interest started with the Higgs.

I have written many times that we work on an improvement in the way we look at the Higgs. And, by now, in fact at the whole standard model. In what we get, we see a clear distinction between two concepts: so-called gauge symmetries and global symmetries. As far as we understand the standard model, it appears that global symmetries determine how many particles of a certain type exist, and into which particles they can decay or be combined. Gauge symmetries, however, seem to be just auxiliary symmetries, which we use to make calculations feasible, and they do not have a direct impact on observations. They have, of course, an indirect impact. After all, which gauge symmetry can be used to facilitate things differs from theory to theory, and thus the kind of gauge symmetry is more a statement about which theory we work on.

Now, if you add gravity, the distinction between both appears to blur. The reason is that in gravity space itself is different. Especially, you can deform space. Now, the original distinction of global symmetries and gauge symmetries is their relation to space. A global symmetry is something which is the same from point to point. A gauge symmetry allows changes from point to point. Loosely speaking, of course.

In gravity, space is no longer fixed. It can itself be deformed from point to point. But if space itself can be deformed, then nothing can stay the same from point to point. Does the concept of global symmetry then still make sense? Or do all symmetries become just 'like' local symmetries? Or is there still a distinction? And what about general relativity itself? In a particular sense, it can be seen as a theory with a gauge symmetry of space. Does this make everything which lives on space automatically a gauge symmetry? If we want to understand the results of what we did in the standard model, where there is no gravity, in the real world, where there is gravity, then this needs to be resolved. How? Well, my research will hopefully answer this question. But I cannot do it yet.

These questions had already been in the back of my mind for some time. A few years, actually; I do not know how many exactly. As quantum gravity pops up in particle physics occasionally, and I have contact with several people working on it, I was exposed to this again and again. I knew that, eventually, I would need to address it, if nobody else did. So far, nobody has.

But why now? What prompted me to start now with it? As so often in science, it was other scientists.

Last year at the end of November/beginning of December, I took part in a conference in Vienna. I had been invited to talk about our research. The meeting had a quite wide scope, and also present were several people who work on black holes and quantum physics. In this area, one goes, in a sense, halfway towards quantum gravity: one has quantum particles, but they live in a classical gravity theory, though with strong gravitational effects. Which is usually a black hole. In such a setup, the deformations of space are fixed. And also non-quantum black holes can swallow stuff. This combination appears to lead to the following: global symmetries appear to become meaningless, because everything associated with them can vanish in the black hole. However, keeping space deformations fixed means that local symmetries are also fixed. So they appear to become real, instead of auxiliary. Thus, this seems to be quite opposite to our result. And this, and the people doing this kind of research, challenged my view of symmetries. In fact, in such a half-way case, this effect seems to be there.

However, in a full quantum gravity theory, the game changes. Then space deformations also become dynamical. At the same time, black holes need no longer swallow stuff forever, because they become dynamical, too. They evolve. Thus, answering what really happens requires full quantum gravity. And because of this situation, I decided to start working actively on quantum gravity. Because I needed to answer whether our picture of symmetries survives, at least approximately, when there is quantum gravity. And to be able to answer such challenges. And so it began.

Within the last six months, I have now worked through a lot of the basic stuff. I have a rough idea of what is going on, and of what needs to be done. And I think I see a way in which everything can be reconciled and make sense. It will still need a long time to complete this, but I am very optimistic right now. So optimistic, in fact, that a few days back I gave my first talk in which I discussed these issues including quantum gravity. It will still need time before I have a first real result. But I am quite happy with how things progress.

And that is the story how I started to look at quantum gravity in earnest. If you want to join me in this endeavor: I am always looking for collaboration partners and, of course, students who want to do their thesis work on this subject 😁

by Axel Maas (noreply@blogger.com) at May 14, 2019 03:03 PM

May 12, 2019

Marco Frasca - The Gauge Connection

Is it possible to get rid of exotic matter in warp drive?

In 1994, Miguel Alcubierre proposed a solution of the Einstein equations (see here) describing a space-time bubble moving at arbitrary speed. It is important to notice that no violation of the light-speed limit happens, because it is space-time itself that is moving, and inside the bubble everything goes as expected. This kind of solution of the Einstein equations has a fundamental drawback: it violates the Weak Energy Condition (WEC) and, in order to exist, some exotic matter with negative energy density must exist. Needless to say, nobody has ever seen such a kind of matter. There seems to be some clue in the way the Casimir effect works, but this just relies on the way one interprets quantum fields rather than on evidence of existence. Besides, since the initial proposal, a great number of studies have been published showing how pathological Alcubierre’s solution can be, also recurring to quantum field theory (e.g. Hawking radiation). So, we have to keep dreaming of a possible interstellar travel, hoping that some smart guy will one day come out with a better solution.

Of course, Alcubierre’s solution is rather interesting from a physical point of view, as it belongs to a family of older solutions, like wormholes, time machines and the like, produced by very famous authors such as Kip Thorne, that arise when one imposes a solution and then checks the conditions for its existence. This amounts to determining the energy-momentum tensor, which, unavoidably, turns out to be negative. These solutions therefore violate whatever energy condition of the Einstein equations, granting pathological behaviour. On the other hand, they are the most palatable to science fiction as possible futures of space and time travel. In these times, when such technologies are widely employed by the film industry, firing the fantasy of millions, we would hope that such futures should also be possible.

It is interesting to note the procedure used to obtain these particular solutions. One engineers them at a desk and then substitutes them into the Einstein equations to see whether they really are solutions; in this way one determines the energy requirements. Going the other way around, it is difficult to come up out of the blue with a solution of the Einstein equations that displays such particular behaviour. It is also possible that such solutions are simply impossible and always imply a violation of the energy conditions. Some theorems proved over the course of time seem to prohibit them (e.g. see here). Of course, I am convinced that the energy conditions must be respected if we want to keep the physics that describes our universe. They cannot be evaded.

So, turning to the question in the title: could we think of a possible warp drive solution of the Einstein equations without exotic matter? The answer can be yes, of course, provided we are able to recover the York time, or warp factor, in the way Alcubierre obtained it with his pathological solution. At first, this seems mission impossible. But the space-time bubble we are considering is a very small perturbation, and perturbation theory can come to the rescue, particularly when this perturbation can be locally very strong. In 2005, I proposed such a solution (see here), together with a technique to solve the Einstein equations when the metric is strongly perturbed. My intent at that time was to give a proof of the BKL conjecture. A smart referee suggested that I give an example of an application of the method. The metric I obtained in this way, by perturbing a Schwarzschild metric, yields a solution that has a York time (warp factor) identical to that of Alcubierre’s metric. Of course, I respect the energy conditions, as I am directly solving Einstein equations that do.

The identity between the York times is obtained provided the form factor proposed by Alcubierre is taken to be 1, but this is just the simplest case. Here is an animation of my warp factor.

Warp factor

The bubble is seen moving as expected along the x direction.
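For readers who want to play with this, the York time of Alcubierre's original metric is easy to evaluate numerically. This is a sketch of Alcubierre's published expressions, not of the perturbative solution discussed in this post; the bubble radius R, wall steepness sigma and velocity v_s are arbitrary illustrative choices:

```python
import numpy as np

def form_factor(r_s, R=1.0, sigma=8.0):
    """Alcubierre's top-hat form factor f(r_s): ~1 inside the bubble, ~0 outside."""
    return (np.tanh(sigma * (r_s + R)) - np.tanh(sigma * (r_s - R))) / (2.0 * np.tanh(sigma * R))

def york_time(x, y, x_s=0.0, v_s=1.0, R=1.0, sigma=8.0):
    """York time theta = v_s * (x - x_s)/r_s * df/dr_s, with df/dr_s taken numerically."""
    r_s = np.sqrt((x - x_s) ** 2 + y ** 2)
    h = 1e-6
    dfdr = (form_factor(r_s + h, R, sigma) - form_factor(r_s - h, R, sigma)) / (2 * h)
    return v_s * (x - x_s) / np.maximum(r_s, 1e-12) * dfdr

# The characteristic dipole pattern: expansion behind the bubble, contraction in front.
behind = york_time(-1.0, 0.0)  # positive: space expands behind
ahead = york_time(1.0, 0.0)    # negative: space contracts ahead
```

Evaluating this on a grid and animating it as the bubble centre x_s moves reproduces the kind of picture shown above.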

My personal hope is that this will go beyond a mathematical curiosity. On the other hand, it remains to be understood how to engineer such perturbations of a given metric. I can think of the Einstein-Maxwell equations solved using perturbation theory. There is a lot of literature on this, with many great contributions.

Finally, this could give a meaning to the following video by NASA.

by mfrasca at May 12, 2019 05:59 PM

May 04, 2019

Clifford V. Johnson - Asymptotia

Endgame Memories

About 2-3 (ish) years ago, I was asked to visit the Disney/Marvel mothership in Burbank for a meeting. I was ushered into the inner workings of the MCU, past a statue of the newly acquired Spidey, and into a room. Present were Christopher Markus and Stephen McFeely, the writers of … Click to continue reading this post

The post Endgame Memories appeared first on Asymptotia.

by Clifford at May 04, 2019 06:34 PM

April 30, 2019

Cormac O’Raifeartaigh - Antimatter (Life in a puzzling universe)

A Week at The Surf Experience

I don’t often take a sun holiday these days, but I had a fabulous time last week at The Surf Experience in Lagos, Portugal. I’m not an accomplished surfer by any measure, but there is nothing quite like the thrill of catching a few waves in the sea with the sun overhead – a nice change from the indoors world of academia.

Not for the first time, I signed up for a residential course with The Surf Experience in Lagos. Founded by veteran German surfer Dago Lipke, guests of The Surf Experience stay at the surf lodge Vila Catarina, a lovely villa in the hills above Lagos, complete with beautiful gardens and swimming pool. Sumptuous meals are provided by Dago’s wife Connie, a wonderful cook. Instead of wandering around town trying to find a different restaurant every evening, guests enjoy an excellent meal in a quiet setting in good company, followed by a game of pool or chess. And it really is good company. Guests at TSE tend mainly to hail from Germany and Switzerland, with a sprinkling from France and Sweden, so it’s truly international – quite a contrast to your average package tour (or indeed our college staff room). Not a mention of Brexit, and an excellent opportunity to improve my German. (Is that what you tell yourself? – Ed)

IMG_2637 (1)

Hanging out at the pool before breakfast

IMG_2634

Fine dining at The Surf Experience

IMG_2624

A game of cards and a conversation instead of a noisy bar

Of course, no holiday is perfect, and in this case I managed to pick up an injury on the first day. Riding the tiniest wave all the way back to the beach, I got unexpectedly thrown off, hitting my head off the bottom at speed. (This is the most elementary error you can make in surfing, and it risks serious injury, from concussion to spinal fracture.) Luckily, I walked away with nothing more than severe bruising to the neck and chest (as later established by X-ray at the local medical clinic, also an interesting experience). So no life-altering injuries, but like a jockey with a broken rib, I was too sore to get back on the horse for a few days. Instead, I tried Stand Up Paddling for the first time, which I thoroughly enjoyed. It’s more exciting than it looks; I must get my own board for calm days at home.

E6Jc2LvY

Stand Up Paddling in Lagos with Kiteschool Portugal

Things got even better towards the end of the week as I began to heal. Indeed, the entire surf lodge had a superb day’s surfing yesterday on beautiful small green waves at a beach right next to town (in Ireland, we very rarely see clean conditions like this, the surf is mainly driven by wind). It was fantastic to catch wave after wave throughout the afternoon, even if clambering back on the board after each wasn’t much fun for yours truly.

This morning, I caught a Ryanair flight back to Dublin from Faro, should be back in the office by late afternoon. Oddly enough, I feel enormously refreshed – perhaps it’s the feeling of gradually healing. Hopefully the sensation of being continuously kicked in the ribs will disappear soon and I’ll be back on the waves in June. In the meantime, this week marks a study period for our students before their exams, so it’s an ideal time to prepare my slides for the Eddington conference in Paris later this month.

Update

I caught a slight cold on the way back, so today I’m wandering around college like a lunatic going cough, ‘ouch’, sneeze, ‘ouch’. Maybe it’s karma for flying Ryanair – whatever about indulging in one or two flights a year, it’s a terrible thing to use an airline whose CEO continues to openly deny the findings of climate scientists.

 

by cormac at April 30, 2019 09:49 PM

April 24, 2019

Andrew Jaffe - Leaves on the Line

Spring Break?

Somehow I’ve managed to forget my usual end-of-term post-mortem of the year’s lecturing. I think perhaps I’m only now recovering from 11 weeks of lectures, lab supervision and tutoring, alongside a very busy time analysing Planck satellite data.

But a few weeks ago term ended, and I finished teaching my undergraduate cosmology course at Imperial, 27 lectures covering 14 billion years of physics. It was my fourth time teaching the class (I’ve talked about my experiences in previous years here, here, and here), but this will be the last time during this run. Our department doesn’t let us teach a course more than three or four years in a row, and I think that’s a wise policy. I think I’ve arrived at some very good ways of explaining concepts such as the curvature of space-time itself, and difficulties with our models like the 122-or-so-order-of-magnitude cosmological constant problem, but I also noticed that I wasn’t quite as excited as in previous years, having worked up from the experimentation of my first time through in 2009, put it all on a firmer foundation — and written up the lecture notes — in 2010, and refined it over the last two years. This year’s teaching evaluations should come through soon, so I’ll have some feedback, and there are still about six weeks until the students’ understanding — and my explanations — are tested in the exam.

Next year, I’ve got the frankly daunting responsibility of teaching second-year quantum mechanics: 30 lectures, lots of problem sheets, in-class problems to work through, and of course the mindbending weirdness of the subject itself. I’d love to teach them Dirac’s very useful notation which unifies the physical concept of quantum states with the mathematical ideas of vectors, matrices and operators — and which is used by all actual practitioners from advanced undergraduates through working physicists. But I’m told that students find this an extra challenge rather than a simplification. Comments from teachers and students of quantum mechanics are welcome.

by Andrew at April 24, 2019 01:19 AM

April 23, 2019

Georg von Hippel - Life on the lattice

Looking for guest blogger(s) to cover LATTICE 2018
Since I will not be attending LATTICE 2018 for some excellent personal reasons, I am looking for a guest blogger or even better several guest bloggers from the lattice community who would be interested in covering the conference. Especially for advanced PhD students or junior postdocs, this might be a great opportunity to get your name some visibility. If you are interested, drop me a line either in the comment section or by email (my university address is easy to find).

by Georg v. Hippel (noreply@blogger.com) at April 23, 2019 01:18 PM

April 16, 2019

Matt Strassler - Of Particular Significance

The Black Hole `Photo’: Seeing More Clearly

THIS POST CONTAINS ERRORS CONCERNING THE EXISTENCE AND VISIBILITY OF THE SO-CALLED PHOTON-SPHERE AND SHADOW; THESE ERRORS WERE COMMON TO ESSENTIALLY ALL REPORTING ON THE BLACK HOLE ‘PHOTO’.  IT HAS BEEN SUPERSEDED BY THIS POST, WHICH CORRECTS THESE ERRORS AND EXPLAINS THE SITUATION.

Ok, after yesterday’s post, in which I told you what I still didn’t understand about the Event Horizon Telescope (EHT) black hole image (see also the pre-photo blog post in which I explained pedagogically what the image was likely to show and why), today I can tell you that quite a few of the gaps in my understanding are filling in (thanks mainly to conversations with Harvard postdoc Alex Lupsasca and science journalist Davide Castelvecchi, and to direct answers from professor Heino Falcke, who leads the Event Horizon Telescope Science Council and co-wrote a founding paper in this subject).  And I can give you an update to yesterday’s very tentative figure.

First: a very important point, to which I will return in a future post, is that as I suspected, it’s not at all clear what the EHT image really shows.   More precisely, assuming Einstein’s theory of gravity is correct in this context:

  • The image itself clearly shows a black hole’s quasi-silhouette (called a `shadow’ in expert jargon) and its bright photon-sphere where photons [particles of light — of all electromagnetic waves, including radio waves] can be gathered and focused.
  • However, all the light (including the observed radio waves) coming from the photon-sphere was emitted from material well outside the photon-sphere; and the image itself does not tell you where that material is located.  (To quote Falcke: this is `a blessing and a curse’; insensitivity to the illumination source makes it easy to interpret the black hole’s role in the image but hard to learn much about the material near the black hole.) It’s a bit analogous to seeing a brightly shining metal ball while not being able to see what it’s being lit by… except that the photon-sphere isn’t an object.  It’s just a result of the play of the light [well, radio waves] directed by the bending effects of gravity.  More on that in a future post.
  • When you see a picture of an accretion disk and jets drawn to illustrate where the radio waves may come from, keep in mind that it involves additional assumptions — educated assumptions that combine many other measurements of M87’s black hole with simulations of matter, gravity and magnetic fields interacting near a black hole.  But we should be cautious: perhaps not all the assumptions are right.  The image shows no conflicts with those assumptions, but neither does it confirm them on its own.

Just to indicate the importance of these assumptions, let me highlight a remark made at the press conference that the black hole is rotating quickly, clockwise from our perspective.  But (as the EHT papers state) if one doesn’t make some of the above-mentioned assumptions, one cannot conclude from the image alone that the black hole is actually rotating.  The interplay of these assumptions is something I’m still trying to get straight.

Second, if you buy all the assumptions, then the picture I drew in yesterday’s post is mostly correct except (a) the jets are far too narrow, and shown overly disconnected from the disk, and (b) they are slightly mis-oriented relative to the orientation of the image.  Below is an improved version of this picture, probably still not the final one.  The new features: the jets (now pointing in the right directions relative to the photo) are fatter and not entirely disconnected from the accretion disk.  This is important because the dominant source of illumination of the photon-sphere might come from the region where the disk and jets meet.

My3rdGuessBHPhoto.png

Updated version of yesterday’s figure: main changes are the increased width and more accurate orientation of the jets.  Working backwards: the EHT image (lower right) is interpreted, using mainly Einstein’s theory of gravity, as (upper right) a thin photon-sphere of focused light surrounding a dark patch created by the gravity of the black hole, with a little bit of additional illumination from somewhere.  The dark patch is 2.5 – 5 times larger than the event horizon of the black hole, depending on how fast the black hole is rotating; but the image itself does not tell you how the photon-sphere is illuminated or whether the black hole is rotating.  Using further assumptions, based on previous measurements of various types and computer simulations of material, gravity and magnetic fields, a picture of the black hole’s vicinity (upper left) can be inferred by the experts. It consists of a fat but tenuous accretion disk of material, almost face-on, some of which is funneled into jets, one heading almost toward us, the other in the opposite direction.  The material surrounds but is somewhat separated from a rotating black hole’s event horizon.  At this radio frequency, the jets and disk are too dim in radio waves to see in the image; only at (and perhaps close to) the photon-sphere, where some of the radio waves are collected and focused, are they bright enough to be easily discerned by the Event Horizon Telescope.
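As a rough cross-check of the sizes involved: for a non-rotating (Schwarzschild) black hole, the dark patch has radius $\sqrt{27}\,GM/c^2$, about 2.6 times the horizon radius, consistent with the 2.5–5 range quoted above once rotation is allowed. Plugging in commonly quoted values for M87’s black hole mass and distance (assumptions here, not numbers taken from the EHT papers) gives an angular diameter of a few tens of microarcseconds:

```python
import math

G = 6.674e-11          # m^3 kg^-1 s^-2
c = 2.998e8            # m/s
M_SUN = 1.989e30       # kg
PC = 3.086e16          # m

# Commonly quoted (assumed) values for M87's black hole:
M = 6.5e9 * M_SUN      # mass
D = 16.8e6 * PC        # distance

r_g = G * M / c**2               # gravitational radius GM/c^2
r_horizon = 2 * r_g              # Schwarzschild event horizon
r_shadow = math.sqrt(27) * r_g   # shadow radius, non-rotating case

ratio = r_shadow / r_horizon     # ~2.6: the lower end of the quoted 2.5-5 range

# Angular diameter on the sky, converted to microarcseconds:
diameter_uas = (2 * r_shadow / D) * (180 / math.pi) * 3600e6
```

The result is roughly 40 microarcseconds, the right ballpark for the ring in the EHT image.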

 

by Matt Strassler at April 16, 2019 12:53 PM

April 06, 2019

Andrew Jaffe - Leaves on the Line

@TheMekons make the world alright, briefly, at the 100 Club, London.

by Andrew at April 06, 2019 10:17 AM

March 31, 2019

Cormac O’Raifeartaigh - Antimatter (Life in a puzzling universe)

My favourite conference; the Institute of Physics Spring Weekend

This weekend I attended the annual meeting of the Institute of Physics in Ireland. I always enjoy these meetings – more relaxing than a technical conference and a great way of keeping in touch with physicists from all over the country. As ever, there were a number of interesting presentations, plenty of discussions of science and philosophy over breakfast, lunch and dinner, all topped off by the annual awarding of the Rosse Medal, a highly competitive award for physics postgraduates across the nation.

banner

The theme of this year’s meeting was ‘A Climate of Change’ and thus the programme included several talks on the highly topical subject of anthropogenic climate change. First up was ‘The science of climate change’, a cracking talk on the basic physics of climate change by Professor Joanna Haigh of Imperial College London. This was followed by ‘Climate change: where we are post the IPCC report and COP24’, an excellent presentation by Professor John Sweeney of Maynooth University on the latest results from the IPCC. Then it was my turn. In ‘Climate science in the media – a war on information?’, I compared the coverage of climate change in the media with that of other scientific topics such as medical science and big bang cosmology. My conclusion was that climate change is a difficult subject to convey to the public, and matters are not helped by actors who deliberately attempt to muddle the science and downplay the threat. You can find details of the full conference programme here and the slides for my own talk are here.

 

Images of my talk from IoP Ireland 

There followed a panel discussion in which Professor Haigh, Professor Sweeney and I answered questions from the floor on climate science. I don’t always enjoy panel discussions, but I think this one was useful thanks to some excellent chairing by Paul Hardaker of the Institute of Physics.

IMG_2504 (1)

Panel discussion of the threat of anthropogenic climate change

After lunch, we were treated to a truly fascinating seminar: ‘Tropical storms, hurricanes, or just a very windy day?: Making environmental science accessible through Irish Sign Language’, by Dr Elizabeth Mathews of Dublin City University, on the challenge of making media descriptions of threats such as storms, hurricanes and climate change accessible to deaf people. This was followed by a most informative talk by Dr Bajram Zeqiri of the National Physical Laboratory on the recent redefinition of the kilogram, ‘The measure of all things: redefinition of the kilogram, the kelvin, the ampere and the mole’.

Finally, we had the hardest part of the day, the business of trying to select the best postgraduate posters and choosing a winner from the shortlist. As usual, I was blown away by the standard, far ahead of anything I or my colleagues ever produced. In the end, the Rosse Medal was awarded to Sarah Markham of the University of Limerick for a truly impressive poster and presentation.

D25jKhvXcAE8vdA

Viewing posters at the IoP 2019 meeting; image courtesy of IoP Ireland

All in all, another super IoP Spring weekend. Now it’s back to earth and back to teaching…

by cormac at March 31, 2019 08:51 PM

March 29, 2019

Robert Helling - atdotde

Proving the Periodic Table
The year 2019 is the International Year of the Periodic Table, celebrating the 150th anniversary of Mendeleev's discovery. This prompts me to report on something that I learned in recent years when co-teaching "Mathematical Quantum Mechanics" with mathematicians, in particular with Heinz Siedentop: We know less about the mathematics of the periodic table than I thought.



In high school chemistry you learned that the periodic table comes about because of the orbitals in atoms. There is the aufbau principle, together with Hund's rules, that tells you the order in which you have to fill the shells and, within them, the orbitals (s, p, d, f, ...). Then, in your second semester at university, you learn to derive those using Schrödinger's equation: You diagonalise the Hamiltonian of the hydrogen atom and find the shells in terms of the principal quantum number $n$ and the orbitals in terms of the angular momentum quantum number $L$, with $L=0$ corresponding to s, $L=1$ to p and so on. And you fill the orbitals according to the Pauli exclusion principle. So, this proves the story of the chemists.
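The filling order taught in school is the Madelung rule: subshells fill in order of increasing $n+L$, with ties broken by smaller $n$. A minimal sketch of that bookkeeping (nothing quantum-mechanical here, just the combinatorics):

```python
L_LABELS = "spdfghi"  # spectroscopic letters for L = 0, 1, 2, ...

def filling_order(n_max=5):
    """All subshells (n, L) with L < n, sorted by the Madelung rule:
    increasing n + L, ties broken by increasing n."""
    shells = [(n, l) for n in range(1, n_max + 1) for l in range(n)]
    return sorted(shells, key=lambda nl: (nl[0] + nl[1], nl[0]))

order = ["%d%s" % (n, L_LABELS[l]) for n, l in filling_order()]
# order begins 1s, 2s, 2p, 3s, 3p, 4s, 3d, ... -- note 4s before 3d,
# which is exactly the non-hydrogenic feature discussed below.
```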

Except that it doesn't: This is only true for the hydrogen atom. But the Hamiltonian for an atom of nuclear charge $Z$ and $N$ electrons (so we allow for ions) is (in convenient units)

$$ H = -\sum_{i=1}^N \Delta_i -\sum_{i=1}^N \frac{Z}{|x_i|} + \sum_{i\lt j}^N\frac{1}{|x_i-x_j|}.$$

The story of the previous paragraph would be true if the last term, the Coulomb interaction between the electrons, were not there. In that case, there is no interaction between the electrons, and we could solve a hydrogen-type problem for each electron separately and then anti-symmetrise the wave functions at the end in a Slater determinant to take into account their fermionic nature. But of course, in the real world, the Coulomb interaction is there, and it contributes like $N^2$ to the energy, so it is of the same order (for almost neutral atoms) as the $ZN$ of the electron-nucleus potential.

The approximation of dropping the electron-electron Coulomb interaction is well known in condensed matter systems, where the resulting theory is known as a "Fermi gas". There it gives you band structure (which is then used to explain how a transistor works).


Band structure in a NPN-transistor
In that case, too, you pretend there is only one electron in the world, which feels the periodic electric potential created by the nuclei and all the other electrons; the latter no longer show up in the wave function but only as a charge density.
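How a single electron in a periodic potential develops a band can be illustrated with the standard 1D tight-binding toy model (the hopping amplitude and lattice spacing below are arbitrary choices, not tied to any material):

```python
import numpy as np

t = 1.0                                        # hopping amplitude (arbitrary units)
a = 1.0                                        # lattice spacing
k = np.linspace(-np.pi / a, np.pi / a, 201)    # crystal momenta in the first Brillouin zone

# Dispersion relation of the single band of a 1D chain: the discrete
# lattice turns the free-particle parabola into a bounded cosine band.
E = -2 * t * np.cos(k * a)

bandwidth = E.max() - E.min()   # 4t for this model
```

Stacking several such bands with gaps between them is the band-structure picture used for the transistor.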

For atoms, you could try to tell a similar story by taking the inner electrons into account, saying that the most important effect of the electron-electron Coulomb interaction is to shield the potential of the nucleus, thereby making the effective $Z$ for the outer electrons smaller. This picture would of course be true if there were no correlations between the electrons and if all the inner electrons were spherically symmetric in their distribution around the nucleus and much closer to it than the outer ones. But this sounds more like a daydream than a controlled approximation.

In the condensed matter situation, the standing of the Fermi gas is much better, as there you can invoke renormalisation group arguments: the conductivities you are interested in are long-wavelength compared to the lattice structure, so we are in the infrared limit, and the Coulomb interaction is indeed an irrelevant term in more than one Euclidean dimension (and yes, in 1D the Fermi gas is not the whole story; there is the Luttinger liquid as well).

But for atoms, I don't see how you would invoke such RG arguments.

So what can you do (with regard to actually proving the periodic table)? In our class, we teach how Lieb and Simon showed that in the $N=Z\to \infty$ limit (which in some sense can also be viewed as the semi-classical limit when you bring in $\hbar$ again) the ground state energy $E^Q$ of the Hamiltonian above is in fact approximated by the ground state energy $E^{TF}$ of the Thomas-Fermi model (the simplest of all density functional theories, where instead of the multi-particle wave function you only use the one-particle electronic density $\rho(x)$ and approximate the kinetic energy by a term like $\int \rho^{5/3}$, which is exact for the free Fermi gas in empty space):

$$E^Q(Z) = E^{TF}(Z) + O(Z^2)$$

where by a simple scaling argument $E^{TF}(Z) \sim Z^{7/3}$. More recently, people have computed more terms in this asymptotic expansion, which goes in powers of $Z^{-1/3}$: the second term ($O(Z^{6/3})= O(Z^2)$) is known, and people have put a lot of effort into $O(Z^{5/3})$, but it should be clear that this technology is still very, very far from proving anything "periodic", which would be $O(Z^0)$. So don't hold your breath hoping to find the periodic table from this approach.
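The scaling argument is short enough to spell out (a standard computation, sketched here from memory). Writing the Thomas-Fermi functional

$$E^{TF}[\rho] = c_{TF}\int \rho^{5/3}\,dx \;-\; Z\int \frac{\rho(x)}{|x|}\,dx \;+\; \frac{1}{2}\iint \frac{\rho(x)\rho(y)}{|x-y|}\,dx\,dy$$

and substituting the rescaled density $\rho_Z(x) = Z^2\rho(Z^{1/3}x)$, each term picks up the same factor: the kinetic term scales as $(Z^2)^{5/3}\cdot Z^{-1} = Z^{7/3}$, the attraction as $Z\cdot Z^2\cdot Z^{-1}\cdot Z^{1/3} = Z^{7/3}$, and the repulsion as $Z^4\cdot Z^{-2}\cdot Z^{1/3} = Z^{7/3}$. Hence $E^{TF}(Z) = Z^{7/3}\,E^{TF}(1)$.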

On the other hand, the chemistry of the periodic table (where the column is supposed to predict chemical properties of the atom, expressed in terms of the orbitals of the "valence electrons") works best for small atoms. So another sensible limit appears to be to keep $N$ small and fixed and only send $Z\to\infty$. Of course, this no longer really describes atoms but rather highly charged ions.

The advantage of this approach is that in the above Hamiltonian you can absorb the $Z$ of the electron-nucleus interaction into a rescaling of $x$, which then lets $Z$ reappear in front of the electron-electron term as $1/Z$. In this limit, one can then try to treat the ugly, unwanted electron-electron term perturbatively.
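Concretely, substituting $x_i = Z^{-1} y_i$ in the Hamiltonian above (a sketch of the standard rescaling, using $\Delta_{x} = Z^2\Delta_{y}$ and $|x_i|^{-1} = Z|y_i|^{-1}$) gives

$$H = Z^2\left[-\sum_{i=1}^N \Delta_i - \sum_{i=1}^N \frac{1}{|y_i|} + \frac{1}{Z}\sum_{i\lt j}^N\frac{1}{|y_i-y_j|}\right],$$

so for large $Z$ the electron-electron repulsion is a perturbation of strength $1/Z$ of an exactly solvable hydrogen-like problem.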

Friesecke (from TUM) and collaborators have made impressive progress in this direction: in this limit they could confirm that for $N < 10$ the chemists' picture is actually correct (with some small corrections). There are very nice slides of a seminar talk by Friesecke on these results.

Of course, as a practitioner, this will not surprise you (after all, chemistry works), but it is nice to know that mathematicians can actually prove things in this direction. Still, there is some way to go, even 150 years after Mendeleev.

by Unknown (noreply@blogger.com) at March 29, 2019 11:02 AM

March 21, 2019

Alexey Petrov - Symmetry factor

CP-violation in charm observed at CERN

 

There is big news that came from CERN today. It was announced at a conference called Rencontres de Moriond, one of the major yearly conferences in the field of particle physics. One of CERN’s experiments, LHCb, reported an observation — yes, an observation, not evidence for, but an observation — of CP-violation in the charm system. Why is this big news and why should you care?

You should care about this announcement because it has something to do with why our Universe looks the way it does. As you look around, you might notice an interesting fact: everything is made of matter. So what? Well, one thing is missing from our everyday life: antimatter.

As it turns out, physicists believe that the amount of matter and antimatter was the same after the Universe was created. So, the $1,110,000 question is: what happened to antimatter? According to Sakharov’s criteria for baryogenesis (a process of creating more baryons, like protons and neutrons, than anti-baryons), one of the conditions for our Universe to be the way it is would be to have matter particles interact slightly differently from the corresponding antimatter particles. In particle physics this condition is called CP-violation. It has been observed in beauty and strange quarks, but never in charm quarks. As charm quarks are fundamentally different from both beauty and strange ones (electric charge, mass, the ways they interact, etc.), physicists hoped that New Physics, something that we have not yet seen or predicted, might be lurking nearby and could be revealed in charm decays. That is why so much attention has been paid to searches for CP-violation in charm.

Now there are indications that the search is finally over: LHCb announced that they have observed CP-violation in charm. Here is their announcement (look for the news item from 21 March 2019). A technical paper can be found here, discussing how LHCb extracted CP-violating observables from a time-dependent analysis of D -> KK and D -> pipi decays.
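In its simplest (time-integrated) form, the observable is just a normalised difference of yields; a toy sketch with invented numbers, not LHCb's actual yields or their full time-dependent analysis:

```python
def acp(n_d0, n_d0bar):
    """Raw CP asymmetry from event yields of D0 and anti-D0 decays
    to the same final state: (N - Nbar) / (N + Nbar)."""
    return (n_d0 - n_d0bar) / (n_d0 + n_d0bar)

# Hypothetical yields, chosen only to show the construction:
a_kk = acp(1_000_000, 1_003_000)    # D -> K+ K-
a_pipi = acp(500_000, 500_500)      # D -> pi+ pi-

# The measured quantity is the difference between the two modes:
# production and detection asymmetries common to both cancel out.
delta_acp = a_kk - a_pipi
```

The real analysis has to disentangle this tiny physics asymmetry (at the 10^-3 level) from instrumental effects, which is why the KK-minus-pipi difference is the quantity reported.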

The result is generally consistent with the Standard Model expectations. However, there are theory papers (like this one) that predict the Standard Model result to be about seven times smaller, with rather small uncertainty. There are three possible interesting outcomes:

  1. The experimental result is correct but the theoretical prediction mentioned above is not. Theoretical calculations in charm physics are hard and often unreliable, so that theory paper may have underestimated the result and its uncertainties.
  2. The experimental result is incorrect but the theoretical prediction mentioned above is correct. Maybe LHCb underestimated their uncertainties?
  3. The experimental result is correct AND the theoretical prediction mentioned above is correct. This is the most interesting outcome: it implies that we are seeing effects of New Physics.

What will it be? Time will tell.

A more technical note on why it is hard to see CP-violation in charm.

One reason that CP-violating observables are hard to see in charm is that they are quite small, at least in the Standard Model. All final/initial-state quarks in the D -> KK or D -> pipi transitions belong to the first two generations. The CP-violating asymmetry that arises when we compare time-dependent decay rates of a D0 to a pair of kaons or pions with the corresponding decays of an anti-D0 can only appear if one picks up the weak phase that is associated with the third generation of quarks (b and t), which is possible via a penguin amplitude. The problem is that the penguin amplitude is small, as the Glashow-Iliopoulos-Maiani (GIM) mechanism makes it proportional to m_b^2 times tiny CKM factors. The strong phases needed for this asymmetry come from the tree-level decays and are (supposedly) largely non-perturbative.

Notice that in B-physics the situation is exactly the opposite: you get the weak phase from the tree-level amplitude, and the penguin one is proportional to m_top^2, so the CP-violating interference is large.

Ask me if you want to know more!

by apetrov at March 21, 2019 06:45 PM

March 16, 2019

Robert Helling - atdotde

Smokescreen: the CDU proposal for "no upload filters"
Sorry, this is one of the occasional posts about German politics. It is my posting to a German-speaking mailing list discussing the upcoming EU copyright directive (which must be stopped in its current form!!! March 23rd is the international day of protest), now that the CDU party has proposed how to implement it in German law, albeit so unspecifically that all the problematic details are left out. Here is the post.

Maybe I am too stupid, but I do not see where exactly the improvement over what is being discussed at the EU level is supposed to lie, except that the CDU proposal is so vague that all its internal contradictions vanish into the fog. Even at the EU level, the proponents say that one should much rather acquire licenses than filter. That in itself is nothing new.

What is new, at least in this Handelsblatt article (I have not found it anywhere else), is the mention of hash sums ("digital fingerprint"), or is that meant to be something like a digital watermark? That would be a real novelty, but it would strangle the whole procedure at birth: only the original file would be protected (which would be trivial to detect anyway), while every form of derivative work would fall completely through the cracks, and one could "free" a work with a trivial modification. Otherwise we are back to the dubious filters based on AI technology that does not exist today.

The other point is the blanket license. I would then no longer have to conclude contracts with every rights holder, but only with a "VG Internet" collecting society. But here the big question is again: who is it supposed to apply to? The intended targets are, of course, once again YouTube, Google and Facebook. But how do you phrase that? This is, after all, the central stumbling block of the EU directive: everyone needs a blanket license, unless they are non-commercial (and who is, these days?), or (younger than three years, with few users and a small turnover), or they are Wikipedia, or they are GitHub? That would again be the "the internet is like television, with a few big broadcasters and so on, just different" worldview, so happily propagated by people who view the internet from a distance, because it practically flattens everything else. What about forums or photo hosters? Would they all have to acquire a blanket license (which would have to be priced high enough to cover all the film and music rights of the entire world)? What prevents this from ending up as a "whoever runs a service on the internet must first buy a paid internet license before going online" law, which at any non-trivial license fee would be the end of all grass-roots innovation?

It would of course also be interesting to see how the revenues of the "VG Internet" would be distributed. One would be a cynic to suspect that a large share would end up with, say, the press publishers. That would finally be the "take the money away from those who earn it on the internet and give it to those who no longer earn so much" law. In that case the license fee had best be a percentage of turnover, in effect an internet tax.

And I will not even start on where this leads if every European country cooks up its own drastically different transposition of the directive.

All in all, a rather successful coup by the CDU, which may well take the wind out of the sails of the critics of Article 13 in public opinion by wrapping everything in a vague cloud of fog, while all the problematic rules are likely to lurk in the details.

by Unknown (noreply@blogger.com) at March 16, 2019 09:43 AM