
What are the main algorithms the LHC particle detectors use to reconstruct decay pathways?

+ 6 like - 0 dislike
292 views

I am just starting to look into how we understand the data from particle collisions.

My question is, what are the algorithms or ways that these detectors interpret the data? Are there standard approaches? Or if not, what are some good papers or places to look to get started in learning more about the implementation and/or details of how this works?

So far I haven't dug into any textbooks, but I have read many articles on the web, and this one was somewhat helpful in pointing to where to look:

http://arstechnica.com/science/2010/03/all-about-particle-smashers-part-ii/

So from my understanding so far, there are a few different LHC "experiments", which are physical structures optimized to focus on specific aspects of the data from a collision event. The detector measures all kinds of particle emissions and changes in electrical fields, and then the analysis seems to backtrack and figure out all the emission/decay events that might have taken place in that split second.

From my understanding so far, the computer programs used to compute these possible "decay pathways" must be using some standard algorithms, and must have built into them all possible particle emission pathways (like a catalogue of all possible Feynman diagrams, if there is such a thing).

Are there any good resources or standard algorithms/approaches to understanding how particle detectors analyze their data?

This post imported from StackExchange Physics at 2014-08-12 09:35 (UCT), posted by SE-user Lance Pollard
asked Aug 9, 2014 in Experimental Physics by Lance Pollard (30 points) [ no revision ]
retagged Aug 12, 2014
Most voted comments
I think CERN has a page with references about the algorithms and software used in various experiments. Apart from that, papers detailing the results of LHC experiments usually mention what kind of algorithms were used and what software, if any (off the top of my head, the papers about the Higgs-like boson experiments had references about the algorithms and software used).

This post imported from StackExchange Physics at 2014-08-12 09:35 (UCT), posted by SE-user Nikos M.
This is a huge topic. I literally took a semester course in grad school to get enough foundation to be ready to start learning when I took up research, and I subsequently attended two different summer schools to learn a bit more. Sub-subjects include tracking, particle ID, jet identification, calorimetry, and a huge body of work on beating the combinatorial explosion in the several places it rears its ugly head.

This post imported from StackExchange Physics at 2014-08-12 09:35 (UCT), posted by SE-user dmckee
@dmckee Hardcore, sounds like a lot but still interesting. What was the name of the course, so I can maybe check out textbooks related to the topic?

This post imported from StackExchange Physics at 2014-08-12 09:35 (UCT), posted by SE-user Lance Pollard
It was a special class without a regular number, and we mostly didn't use a text, but rather selected papers. We all had copies of Perkins and of Leo from previous course work.

This post imported from StackExchange Physics at 2014-08-12 09:35 (UCT), posted by SE-user dmckee
You might be interested in this page. Basically the vast majority of the collision data is discarded immediately by custom hardware; the vast majority of what's left is discarded immediately by slower but more sophisticated software; and what passes those stages (a mere GB/s or so of data) is stored forever and analyzed later (probably multiple times, by different groups, for different purposes, using different techniques).

This post imported from StackExchange Physics at 2014-08-12 09:35 (UCT), posted by SE-user benrg
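The cascade benrg describes can be sketched in a few lines of Python. This is purely illustrative: the event fields (`max_et`, `n_tracks`), the cut values, and the function names are all made up for the sketch, and the real first stage runs in custom hardware on raw detector signals, not on Python dictionaries.

```python
import random

def hardware_trigger(event):
    # Stage 1 (illustrative): a cheap cut on a single quantity,
    # standing in for the custom-hardware level-1 trigger.
    return event["max_et"] > 20.0

def software_trigger(event):
    # Stage 2 (illustrative): a slower, more refined selection,
    # standing in for the high-level software trigger farm.
    return event["max_et"] > 20.0 and event["n_tracks"] >= 2

def select_events(events):
    """Cascade the filters; only survivors are written to storage."""
    return [ev for ev in events
            if hardware_trigger(ev) and software_trigger(ev)]

# Fake collision events with a maximum transverse energy deposit
# and a track count, just to exercise the cascade.
random.seed(0)
events = [{"max_et": random.uniform(0.0, 50.0),
           "n_tracks": random.randint(0, 10)} for _ in range(1000)]
stored = select_events(events)
```

The point is only the structure: each stage is stricter and more expensive than the last, so the cheap cut runs on everything and the expensive analysis only on what survives.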
Most recent comments
Yeah, that would be perfect! This is the closest I've found so far to a description of the algorithms that might be used (like Monte Carlo methods): quantumdiaries.org/2010/12/11/when-feynman-diagrams-fail.

This post imported from StackExchange Physics at 2014-08-12 09:35 (UCT), posted by SE-user Lance Pollard
In a sense the Monte Carlo (simulation) is not a reconstruction algorithm at all. It is a critical part of the analysis chain, but it comes into play after you have figured out what kinds of things were going on in each event. I've written a bit about how MCs are used elsewhere on the site, here and here.

This post imported from StackExchange Physics at 2014-08-12 09:35 (UCT), posted by SE-user dmckee

2 Answers

+ 4 like - 0 dislike

There are as many algorithms as there are experimental setups times the detectors used in those setups. They are built to fit the detectors, and not the other way around.

The common aspects are few:

1) Charged particles interact with matter by ionizing it, and one builds detectors where the passage of an ionizing particle can be recorded. It can be a bubble chamber, a time projection chamber, or a vertex detector (of which there exist various types). These are used in conjunction with strong magnetic fields, and the bending of the tracks gives the momentum of the charged particle.

2) Neutral particles are either

a) photons, which the electromagnetic calorimeters measure;

b) hadronic, i.e. they interact with matter, and hadronic calorimeters are designed to measure the energy of these neutrals;

c) weakly interacting, such as neutrinos, which can only be detected by measuring all the energy and momenta in the event and finding the missing energy and momentum.

In addition there are the muon detectors: muons are charged tracks that go through meters of matter without interacting except electromagnetically, and the outermost detectors are designed to catch them.
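As a small worked example of points 1) and 2c): the standard rule of thumb p_T [GeV/c] ≈ 0.3 · B [T] · R [m] converts a track's bending radius into transverse momentum, and the missing transverse momentum is minus the vector sum of everything that was measured. A minimal sketch (the numbers are illustrative; 3.8 T is the nominal CMS solenoid field):

```python
import math

def pt_from_curvature(B_tesla, radius_m):
    # p_T [GeV/c] ~= 0.3 * B [T] * R [m] for a unit-charge track
    # bending in a solenoidal field.
    return 0.3 * B_tesla * radius_m

def missing_et(particles):
    # particles: list of (pt, phi) for everything measured.
    # The missing transverse momentum is minus the vector sum.
    px = -sum(pt * math.cos(phi) for pt, phi in particles)
    py = -sum(pt * math.sin(phi) for pt, phi in particles)
    return math.hypot(px, py), math.atan2(py, px)

# A track bending with a 1.85 m radius in a 3.8 T field:
print(pt_from_curvature(3.8, 1.85))   # ~ 2.1 GeV/c

# Two back-to-back particles balance, so nothing is "missing":
print(missing_et([(10.0, 0.0), (10.0, math.pi)])[0])
```

Real track fits of course work from dozens of noisy hits, with multiple scattering and energy loss folded in; the formula is just the physics underneath.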

[Image: the CMS detector]

The complexity of the LHC detectors requires these enormous collaborations of ~3000 people working toward one goal: getting physics results out of the system. Algorithms are a necessary part of this chain and are made to order using the basic physics concepts that drive the detectors.

As CuriousOne says, a lot of elbow grease is needed in order to understand the algorithms entering into the data reduction from these detectors. They are certainly custom made.

This post imported from StackExchange Physics at 2014-08-12 09:35 (UCT), posted by SE-user anna v
answered Aug 11, 2014 by anna v (1,785 points) [ no revision ]
+ 2 like - 0 dislike

Well, if you have the time... CERN has all the technical design reports for its detectors online at http://cds.cern.ch/. They are excellent reading material.

Start with a search for "ATLAS technical design report" and "CMS technical design report" and work your way through the references in those documents. Once you understand the geometry of the detectors (not a small feat), you can start reading about "trigger algorithms" and "reconstruction algorithms". You may have to pick up a thing or two about particle-matter interactions and the GEANT simulation software.

Little warning... it took me almost two years to read through just the parts that were important to my work...

This post imported from StackExchange Physics at 2014-08-12 09:35 (UCT), posted by SE-user CuriousOne
answered Aug 11, 2014 by CuriousOne (20 points) [ no revision ]
Do you have a particularly good link from one of the searches? This looks pretty awesome, just the kind of thing I was looking for: some real data :)

This post imported from StackExchange Physics at 2014-08-12 09:35 (UCT), posted by SE-user Lance Pollard

If you google "cms algorithm" you are offered a list of algorithms. I picked "jet algorithms", and this is the link: https://twiki.cern.ch/twiki/bin/view/CMSPublic/WorkBookJetAnalysis

You will see the complexity and continuous changes that we are talking about, just for this one algorithm.
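To get a feel for what such a jet algorithm does, here is a toy implementation of anti-kt clustering, which is widely used at the LHC. It is a sketch under heavy simplifications: particles are plain (pt, rapidity, phi) tuples, recombination is naive pt-weighted averaging instead of proper four-vector sums (which also mishandles phi wrap-around for merged pairs near ±π), and there is no attempt at the fast geometric tricks real implementations such as FastJet use.

```python
import math

def delta_r2(a, b):
    """Squared angular distance between (pt, y, phi) pseudo-particles."""
    dphi = abs(a[2] - b[2])
    if dphi > math.pi:
        dphi = 2 * math.pi - dphi
    return (a[1] - b[1]) ** 2 + dphi ** 2

def anti_kt(particles, R=0.5):
    """Toy anti-kt: repeatedly find the smallest distance; merge the
    pair if it is a pairwise distance, or promote the particle to a
    jet if it is a beam distance d_iB = 1/pt^2."""
    parts = list(particles)
    jets = []
    while parts:
        best = None  # (distance, i, j) with j = None for a beam distance
        for i, pi in enumerate(parts):
            diB = pi[0] ** -2
            if best is None or diB < best[0]:
                best = (diB, i, None)
            for j in range(i + 1, len(parts)):
                pj = parts[j]
                dij = (min(pi[0] ** -2, pj[0] ** -2)
                       * delta_r2(pi, pj) / R ** 2)
                if dij < best[0]:
                    best = (dij, i, j)
        _, i, j = best
        if j is None:
            jets.append(parts.pop(i))      # promote to a final jet
        else:
            pi, pj = parts[i], parts[j]    # merge i and j
            pt = pi[0] + pj[0]
            y = (pi[0] * pi[1] + pj[0] * pj[1]) / pt
            phi = (pi[0] * pi[2] + pj[0] * pj[2]) / pt
            parts.pop(j); parts.pop(i)
            parts.append((pt, y, phi))
    return jets
```

Because the distance measure favors high-pt particles, anti-kt grows jets outward from hard cores, which is what gives it its characteristically regular, cone-like jets.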

user contributions licensed under cc by-sa 3.0 with attribution required
