MadEvent is a package derived from MadGraph (which is quite stable and has been around for a long time). It produces events for SM and BSM processes at the parton level, and it also interfaces with Pythia. This isn't new. What is unique is that all of this can be driven from a web interface.

The web page is: http://madgraph.hep.uiuc.edu/

Detailed instructions from Reinhard, May 24, 2007:

To get started, you should register on the registration web page. That will generate a username and password for you, which you need in order to generate any processes.

Then go to the "Generate Process" page and start playing around with it. You can look at the example processes that it provides for you, for example uu~>dd~ as the input process. You can look at many other example processes as well, for example an electroweak one: ud~>e+ve. To generate something Tevatron or LHC related, you would use pp as the input particles, for example pp>e+ve. Once you enter this you'll see that there are more Feynman diagrams that can produce this process. (Note that as far as madgraph is concerned, p and p~ contain the same partons.)

After you click "Generate Process", the program will take a little while and then direct you to a page for this process. If you click on the "Process Information" link you get to see all of the relevant processes and links to all of the Feynman diagrams.

Play with these a little bit, and enter different processes that you can think of to get an idea. Examples that I can think of are p p > t t~ or, including decays, p p > t t~ > b b~ w+ w-.

Two processes you should look at are the two single top ones: p b > t j, which is t-channel single top quark production, and p p > t b~, which is s-channel single top quark production.

For these last two, you should not only look at the Feynman diagrams but also produce events. Here's how to do that: The button "Code Download" allows you to download a .tar.gz file that does some event generation. Download it. Untar it and follow the instructions in the README file to run it. You should edit the Cards/run_card.dat file and adjust the following parameters:
#*********************************************************************
# Collider type and energy                                           *
#*********************************************************************
        1     = lpp1  ! beam 1 type (0=NO PDF)
       -1     = lpp2  ! beam 2 type (0=NO PDF)
      980     = ebeam1  ! beam 1 energy in GeV
      980     = ebeam2  ! beam 2 energy in GeV 

If you are on one of the Fermilab machines, then you will need to do a "setup D0RunII" to get the compiler set up before you generate events. If you are anywhere else in the world, the compiler should already be set up properly. Run the ./bin/generate_events script, choose a serial run, give it a reasonable name, and then just wait for the events to be generated.

Then when you have an output file, you can use the web interface again to produce a root file and decay the top quark. Go to the "Tools" -> "Decay Interface" page. Browse to the Events/*_unweighted_events.lhe.gz file and load it. Select that you do want root output. Since your two single top files have a top quark in them, select the decay t > b ve e+. Then click on "Upload & Decay". After a while you'll get a page with links, one of which is the root file. Download the root file.

Download the ExRootAnalysis package from the "Downloads" tab. Follow the instructions to produce the library and then run the simple macro given in the Readme file at http://madgraph.hep.uiuc.edu/Downloads/ExRootAnalysis/README in order to produce your first histograms.
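If it helps to see the shape of such a macro, here is a minimal first-look sketch (not the README macro itself; the file, tree, and branch names are assumptions based on a typical ExRootAnalysis output, so substitute the names you actually have):

{
  gSystem->Load("libExRootAnalysis.so");   // library built by following the README
  TFile f("output.root", "READ");          // ROOT file downloaded from the decay interface
  TTree *tree = (TTree*)f.Get("LHEF");     // "LHEF" is the usual tree name in these files
  tree->Draw("Particle.PT");               // quick histogram of particle transverse momenta
}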

Once you have a histogram of jetPT and Mass as in the example macro we should talk again.

These instructions are quite long and it is likely that I missed something somewhere. There are also many steps involving topics you probably aren't familiar with yet, and this is supposed to be a learning experience. Let me know if you get stuck or if you have any questions.

Emily Johnson has done a study of single top with CDF-style cuts.

-- ChipBrock - 24 May 2007 -- ReinhardSchwienhorst - 10 Sep 2007

Instructions for New Users of MadEvent and ROOT by Andrew Chegwidden and Jacob Parsons, 3 Aug 2011:

We've created these instructions as a starting point for any student using MadEvent. The scope of these instructions is limited to generating a ROOT file from MadEvent, performing cuts, and creating histograms from code.

They are meant as a starting point from which one can develop a more in-depth analysis of SM and BSM processes at the parton level.

Getting Started and Output Code Generation

The first step in the process is signing up for an account at http://madgraph.hep.uiuc.edu/. Once your account is established you can begin to generate your processes. You have a choice of what model you want MadEvent to follow as well as the process you wish to study. There is a decent list of examples and formats in a link next to the input field. Our initial study focused on the production of the W boson together with the top quark (p p > w- t). Once you've input your desired process you'll need to select your desired p and j definitions. Our study required p to stand for all possible quarks (the last line of the drop-down menu).

Clicking submit will send you to a page where you can download the output code. If the generation worked properly, you will also see the appropriate Feynman diagrams.

On this page you can download the output code as a "tar" file, which you can unpack from the command line with "tar -xzf".

Changing Attributes, Generating Events, Decay Interface, and ROOT Output File

Once the files are unpacked, you can change certain attributes before generating events, as well as see what the cross sections are. In our analysis we changed the beam energies in the ~/Cards/run_card.dat file (lines 38 and 39) to better represent what's coming out of the LHC.

# Number of events and rnd seed                                      *
#*********************************************************************
  10000       = nevents ! Number of unweighted events requested 
      0       = iseed   ! rnd seed (0=assigned automatically=default))
#*********************************************************************
# Collider type and energy                                           *
#*********************************************************************
        1     = lpp1  ! beam 1 type (0=NO PDF)
        1     = lpp2  ! beam 2 type (0=NO PDF)
     3500     = ebeam1  ! beam 1 energy in GeV
     3500     = ebeam2  ! beam 2 energy in GeV

Other cards can be manipulated as the user sees fit. We did not make any other changes for this analysis.

Once the user has made his changes he can generate events from the command line by running "./bin/generate_events". We ran a serial run because the calculations were so short. In some instances the ~/Cards/param_card.dat file may need to be edited. If this is the case you can utilize MadEvent's calculators under the Tools heading. One should use this tool as opposed to doing the editing by hand.

Once the events have been generated, an output "LHE" file will be created locally. If a decay is requested, this LHE file can be uploaded to the MadEvent site under Tools --> Decay Interface. You'll need to upload the locally produced LHE file and request a SM model decay. If you only want to look at one decay and would like ROOT output, then just click Upload & Decay. If, however, you would like to further decay the events, then you will need to perform multiple decays on successive LHE files.

Once you've finished running the decay process you can download the final ROOT file which you can analyze and use to generate various histograms.

Alternatively, the user could have forgone the whole decay interface procedure and directly entered the process with final state particles in the "generate process" page. For single top production in the dilepton channel it should look like this:

p p > w- t > b e+ e- ve~ ve

Once the process is generated the user should run ./bin/generate_events as normal. Once the events have been generated the user should upload the LHE event file (in the Events directory) and the plot_card.dat file (in the Cards directory) in the Plotting Interface (ExRootAnalysis) page under the Tools link. This will generate the same .root file as before.

In our analysis we inputted the whole process with final states directly in the generate process box:

WT in dilepton channel: p p > w- t > b e+ e- ve~ ve
TTbar in dilepton channel: p p > t~ t > b b~ e+ e- ve~ ve

Process Information, Results, and Feynman Diagrams

Once the tar file has been unpacked and events have been generated, an html file called index.html is created in the directory. This file is what is called the "MadEvent Card" on the website. It is specific to the process which the user has entered. Opening the file yields links to other files within the directory (it is important not to rename or move files once the code has been unpacked).

Below is an example of an index.html file (links on the page are local directory links).

Clicking on the "Process Information Link" will send the user to a page which shows which processes and subprocesses yield the production of a W- boson and top quark. The user should recall that he told MadEvent that the p (proton) definition should include gluons and the up, down, bottom, and strange quarks/anti-quarks. The strange and bottom quark/anti-quarks are obviously virtual particles which are referred to as the "quark sea" or "sea quarks." The production of the W- boson and top quark does not stem from any processes dealing with virtual particles. Other products such as the production of ttbar are produced as a result of virtual particle interactions. The user can see this for himself by completing the whole procedure again with the p p > t~ t > b b~ e+ e- ve~ ve input process.

This page also has links to the Feynman diagrams. Below are the Feynman Diagrams for WT and TTbar in the dilepton channel

  • W-t Feynman Diagrams (g b > w- t > b e+ e- ve~ ve ):
    gb.png
  • ttbar Feynman Diagrams (b b~ > t t~ > b b~ e+ e- ve~ ve):
    bb.png
  • ttbar Feynman Diagrams (g g > t t~ > b b~ e+ e- ve~ ve ):
    gg.png
  • ttbar Feynman Diagrams (u u~ > t t~ > b b~ e+ e- ve~ ve ):
    qq.png

Also on the Process Information page is a link to the proc_card_mg5.dat file which gives an overview of the model, input process, and p/j definitions requested by the user.

Results and Event Database

From the index.html file the user can navigate to the Results and Event Database page. This page shows the total cross section for the process as well as the run name, information on the collider, and the number of events generated. Clicking on the "results" link sends the user to a page which again shows the user the total cross section. The P0_gb_wmt link navigates the user to the Feynman diagrams (two in this case as a result of two subprocesses). Clicking on the link under the Cross Sect column lets the user see the cross section of the two subprocesses.

Generating Histograms From Code

(please note this was our initial code generation; the code has changed considerably since)

Specific histograms can be created by writing C++ macros. This is most easily done by using the ROOT command line to open the ROOT file. Once open, use the .ls command to view the list of trees within the ROOT file; usually there are only one or two trees in the file. Use the MakeClass command on the tree you want to work on (for example, one of the trees we found using .ls was called "LHEF", so we entered "LHEF->MakeClass("myMacro")", where "myMacro" is the name you choose to give the .h/.C files produced by the command; in this example the files myMacro.h and myMacro.C will be created automatically). The .h file's purpose is to associate the raw data of the root file with variables the user can work with; the .C file is used to loop through each event containing these variables. After spending some time figuring out the internal structure of these files, the user can edit them to create histograms specific to desired particles/properties/cuts/etc.

An additional file (with extension .C) must be created from scratch. In this file, an object of the class defined in the .h/.C files must be created and the Loop function defined in the original .C file must be applied to the object. Make sure that all four files (the original root file, the auto-generated .h/.C files, and the final .C file) are in the same directory and then execute the final .C file as a macro in ROOT. The attached files take the input root file and generate a histogram of electron transverse momentum, with specific comments embedded in the code.
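For orientation, the standard way to run a MakeClass skeleton from the ROOT prompt looks like the following sketch (the attached myMacroEx.C wraps the same steps in a small macro):

root [0] .L myMacro.C
root [1] myMacro m;        // the constructor opens the ROOT file named in myMacro.h
root [2] m.Loop();         // loops over all events; histogram filling/drawing goes in Loop()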

Any number of histograms can be generated using this code as a format. Additionally, users can make cuts and loop over all events.

One histogram the user may be interested in creating is the MET (missing transverse energy). Neutrinos pass right through a real detector, leaving missing energy, which can be calculated. MadEvent, which acts as an ideal detector, keeps track of these neutrinos. The user can use 4-vectors to determine the transverse energy of these neutrinos within MadEvent, which would be missing in a real detector.
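As a hedged illustration, assuming the MakeClass leaf names Particle_PT, Particle_Eta, and Particle_Phi, and that i1 and i2 are the indices of the two neutrinos in the event, the MET can be built from a vector sum:

TLorentzVector nu1, nu2;
nu1.SetPtEtaPhiM(Particle_PT[i1], Particle_Eta[i1], Particle_Phi[i1], 0.0);
nu2.SetPtEtaPhiM(Particle_PT[i2], Particle_Eta[i2], Particle_Phi[i2], 0.0);
TLorentzVector met = nu1 + nu2;   // vector sum of the invisible particles
double MET = met.Pt();            // missing transverse energy for this event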

Once you become familiar with utilizing the computational advantages of MadEvent and developing ROOT macros you can begin to filter out any background which may obscure the specific process you wish to study.

In MadEvent the user asks the program to create events involving only one process with subsequent decays into a final state. Our initial study concentrated on the production of a W boson together with a top quark (W- t) with a final state consisting of two leptons (electron and positron), two electron neutrinos, and one jet. Other processes may leave very similar final states. Consider the production of a top quark along with a top anti-quark (ttbar). If both W bosons (W+ from the top quark decay and W- from the top anti-quark decay) decay into leptons and neutrinos (dileptonic channel), then the final state is exactly the same as the W- t process except for the presence of another jet (the hadronization of the bbar which came from the tbar decay). Producing the same histograms for the ttbar process would allow the user to look for possible differences between the two processes and may allow the user to subtract out the background (the ttbar process). All of this is transparent within MadEvent but not in a physical detector. Herein lies MadEvent's usefulness.

Loop Over Events

It may be important to define what MadGraph considers an "event." Within MadGraph an "event" is the collision of two incoming particles (from the two beams) and includes the information for all the intermediate and final state particles. In short, consider an event as one single collision and everything that happens afterwards, with complete knowledge of all kinematic information.

The ROOT framework is designed to read over all the branches of the ROOT tree once. This is the most efficient way to generate histograms. If the user chooses to create a function within his code that creates and fills histograms, then he should create that function so that it only reads the branches once (possibly twice if he wants to first determine what histograms to generate). We initially created code that read over all events every time the "Make Histogram" function was called. This slowed our code down considerably.
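A sketch of this single-pass pattern inside the MakeClass Loop() method (fChain and GetEntry come from the MakeClass skeleton; the histogram names and binning are placeholders):

TH1F *h_elePt = new TH1F("h_elePt", "Electron pT;pT [GeV];Events", 100, 0., 200.);
TH1F *h_met   = new TH1F("h_met",   "MET;MET [GeV];Events",        100, 0., 200.);
Long64_t nentries = fChain->GetEntries();
for (Long64_t jentry = 0; jentry < nentries; ++jentry) {
  GetEntry(jentry);                 // read this event's branches once
  // compute quantities and fill every booked histogram here
}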

Cuts on Particles

When defining a jet the user should require that the final state quarks have energy greater than 25 GeV. However, as the user can see from the Feynman diagrams of the g b > w- t process (see above for directions to these diagrams), there is a bottom quark in the initial state. It is the initial interaction of a bottom quark and a gluon that gave rise to all the data. There is also a bottom quark in the final state, and it is this final state bottom quark to which we wish to apply the cuts. If the user naively inputs the statement "if bottom quark PT is greater than 25 GeV...", he will accidentally make cuts on the initial state bottom quark as well as the final state bottom quark.

ROOT identifies initial-state particles (in this case the bottom quark and gluon) as having a status number of -1, intermediate particles (in this case the W- and W+ bosons and the top quark) as having a status number of 2, and final-state particles (in this case the leptons and neutrinos) as having a status number of 1.

So the user needs to be cognizant of where in the interaction/decay chain he wishes to make cuts and utilize the "Particle.Status" variable to implement those cuts.

In our code we instantiated TLorentzVectors corresponding to particle PID numbers before filling any histograms or performing any functions. We then utilized the information in these vectors to fill our histograms. The advantage of doing this is that we were able to place cuts on the Particle.Status variable early on without the danger of including (or excluding) data later on in the code.
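A minimal sketch of that pattern, assuming the MakeClass leaf names Particle_PID, Particle_Status, Particle_PT, Particle_Eta, Particle_Phi, and Particle_M, a counter nParticles for the number of particles in the event, and a hypothetical histogram h_bPt:

TLorentzVector bquark;
for (int ip = 0; ip < nParticles; ++ip) {
  // PID 5 is a b quark; status 1 selects only the final-state b, not the incoming one
  if (Particle_PID[ip] == 5 && Particle_Status[ip] == 1) {
    bquark.SetPtEtaPhiM(Particle_PT[ip], Particle_Eta[ip],
                        Particle_Phi[ip], Particle_M[ip]);
    if (bquark.Pt() > 25.0) h_bPt->Fill(bquark.Pt());   // cut applied to the final-state b only
  }
}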

Proper Normalization(s)

When comparing histograms from two different processes it is necessary to normalize the histograms in such a manner that the process that happens more frequently (higher cross section) is more heavily weighted. A simple way to accomplish this is to normalize the histograms to the cross section (as opposed to unity or any other familiar normalization). The user could use the scaling method in his code to require that the total integral of the histogram equal the cross section of that process:
Double_t scale = CrossSection/myhist->Integral();
myhist->Scale(scale);

Similarly the user could weight the entries used to fill the histograms. This can be done by utilizing the "Fill" method. The second argument in this method is the weighting factor. If the user uses the total cross section and divides by the number of entries (number of entries before any cuts have been applied) then data coming from the less frequent event will be weighted less.
myhist->Fill(WT_MET, 51.623/10000);
myhist->Fill(TTbar_MET, 922.14/10000);

In this example a histogram for MET is filled for the WT and TTbar processes. The TTbar process has a much higher cross section (922.1 fb compared to 51.6 fb for that of WT). Each process had 10,000 events as dictated in the run_card.dat file. The user should note that he is already at a disadvantage because his signal occurs at a much smaller cross section than the background he wishes to reduce.

Once the histograms have been properly filled/normalized then the user can place both histograms on the same canvas:

  • MET of WT (red) and TTbar (black) at 7 TeV beam energy:
MET.png

ROOT Documentation and Useful Websites

The ROOT homepage has links to documentation pages which may be useful to the user. The user's guide as well as the class index can be extremely useful. The class index page has an alphabetical class listing. Clicking on a specific class brings the user to a page showing the function members and methods of that class. The user should be sure to click the "Show inherited" option in the box at the top right of the page.


-- AndrewChegwidden - 17 Aug 2011 -- JakeParsons - 17 Aug 2011

Important Updated Information for Users of MadGraph/ROOT by Elizabeth Drueke and Brad Schoenrock, 6 June, 2013:

MadGraph5 is a Monte Carlo event generator for parton-level decay and collision processes at high-energy colliders, such as the LHC. In this project, it was used to develop code to isolate t channel events in the presence of various background decays.

Getting Started with MadGraph5

Prior to using the MadGraph5 software to generate processes, you will first need to register an account online from the MadGraph home page. This is a relatively quick and simple process, at the end of which you will receive an email with a password. Once you have received your password, you will not only have access to the MadGraph generation software, but you will also have access to a personal database recording all of the processes you have generated to date.

Generating Processes

To generate a process, visit the MadGraph site and use section I of the home page. All of our processes were generated using the Standard Model, though there are other options, including SM with CKM and SM + DY resonances, among others.

Input the process you would like. Keep in mind that every particle and every arrow of your input must be separated by a space. Antiquarks are specified by a tilde (~). Additionally, the W must be specified as either positive or negative. For example, one of our W+ Jet inputs was p p > w+ j j, w+ > e+ ve. Commas allow you to specify further decay. However, this may also be accomplished with the use of additional arrows. For example, when generating our W+Z decay for our background analysis, we used p p > w+ z > e+ ve j j. MadGraph automatically assigns the jets to the Z and the lepton-neutrino pair to the W+.

Next, you must set your p and j definitions. We took j to be a down, up, strange, charm, bottom, or one of their antiquarks. You must also choose your lepton definition. In our study, we specified all leptons to be electrons (e-) or positrons (e+). That is, we did not use the lepton symbol (“l”) available in MadGraph. However, it is also possible to, in your input, replace these with a symbol for any lepton (l+, l-). Then, from the lepton drop-down, choose which leptons you wish MadGraph to consider in its generation. Once you have completed all of these steps, you may submit your process to the Monte Carlo generator.

The process will usually take a few minutes to generate. If the process is generating, there will be an image of spinning wheels. If, however, there is a problem with the generation, a stop sign will come up. By clicking the Generation progress/status button, you can look into the errors that arose while the process was being generated. An example of an error is a lack of charge specification (i.e. neglecting to specify W+ or W-). These failed generations will still remain in your database for future reference.

Once your process has successfully been generated, you may wish to click the Process Information link. This will allow you to look at either a postscript or html file of the Feynman diagrams generated by MadGraph for your process. This is a good way to ensure that the correct process was generated. The Feynman diagrams are discussed in greater detail above.

Downloading and Editing the Generation File
Once you have ascertained that your process is correct, you can download a .tar file by clicking the Code Download link. It is highly recommended, particularly if you are working with multiple processes at a time, that you create a directory for each process. Download the .tar to this directory. You can unzip the file with the command "gzip -d madevent.tar.gz" and untar it with "tar -xf madevent.tar".

Once you have successfully done this, you will have access to a variety of directories. By going into the Cards directory, you can edit the run_card.dat, which controls the generation of events. In our study, we generated 1000000 events (change nevents, L33) at a beam energy of 4000 GeV per beam (ebeam1, L42; ebeam2, L43). There are a variety of other variables on the run_card that you may want to investigate for your studies.

#*********************************************************************
  1000000     = nevents ! Number of unweighted events requested
        0     = iseed   ! rnd seed (0=assigned automatically=default))
#*********************************************************************
# Collider type and energy
# lpp: 0=No PDF, 1=proton, -1=antiproton, 2=photon from proton,
# 3=photon from electron
#*********************************************************************
        1     = lpp1    ! beam 1 type
        1     = lpp2    ! beam 2 type
     4000     = ebeam1  ! beam 1 total energy in GeV
     4000     = ebeam2  ! beam 2 total energy in GeV
#*********************************************************************

Generating Events

Once you have made the necessary changes to the run_card.dat file, it is time to generate your events. To do this, simply go into the bin directory and run ./generate_events. It is also possible to use bash scripting to submit the event generation to condor so as not to clog up your home network. The event generation process takes anywhere from 15 minutes to a couple of hours, depending upon the process you are generating.

When the events are fully generated, a crossx.html file will appear in your directory. This page will tell you the cross section of your process in picobarns (pb) and contains a link to a downloadable .LHE file of your events. Download this file and convert it to a .root file.

Using ROOT
When MadGraph generates events, it looks at every p p interaction and creates hypothetical data on nearly every aspect of that process. It records information such as Pt, Eta, Phi, mass, Status, etc. for every particle which takes part in the process. So, for example, in our t channel generation, we would have access to the Pt, Eta, and Phi of the top, b bar, jet, lepton, neutrino, b, and our initial state particles. There will also be entries for various jets depending on the jet definition. Using the particle ID feature will allow you access to the type of quark forming the jet.

From here, you may use the ROOT TBrowser to view basic histograms of the data generated by MadGraph. If you wish to perform further analysis, you will have to write a C++ program which loops over events and particles in the process.

To incorporate the root file into your code, you will have to define a TFile and a TTree.

TFile *rootfile = new TFile("unweighted_events.root", "read");
TTree *LHEF = (TTree*)rootfile->Get("LHEF");
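A bare-bones sketch of the event loop over this tree (to actually read the leaves you would first call SetBranchAddress, or use the MakeClass approach described in the 2011 section above):

Long64_t nentries = LHEF->GetEntries();
for (Long64_t i = 0; i < nentries; ++i) {
  LHEF->GetEntry(i);        // load event i
  // read particle quantities and fill histograms here
}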

To write your code, you may want to look at the ROOT documentation.

When generating histograms, it is important to normalize them to ensure that the data from the detector is properly represented. In our analysis, we scaled all of the histograms by the cross section of the process divided by the total events generated (1000000). Our scaling process is discussed in greater detail below.

Our Analysis
Our study, as mentioned above, was on the t channel and isolating this decay against other background decays. The MadGraph processes we generated can be accessed below. It is important to note that, with few exceptions, we generated only positive-lepton (positron) events, so our final results may be biased due to the asymmetry of the t channel. Within our code, we filled histograms with truth-level information using the particle ID leaf of the unweighted_events.root file. Then we cut and clustered our jets as described below. Next, we reconstructed the neutrino, W, and top using four-vectors, making more cuts, as described below.

When making the histograms, we made sure to scale them according to the cross section over the number of events generated in order to ensure that the integral of the histogram was equivalent to the cross section. It is also possible to scale your histograms to a given luminosity. This process is described in detail at https://hep.pa.msu.edu/wiki/AtlasSingleTop/EventWeights. We also made histograms scaled to a 20/fb luminosity.
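For example, the 20/fb histograms can be produced with a single scale factor per process (a sketch; the numbers and histogram name are placeholders):

double xsec = 1.0;          // process cross section in fb (placeholder; convert from the pb value in crossx.html)
double lumi = 20.0;         // target integrated luminosity in fb^-1
double nGen = 1000000.0;    // number of generated events (nevents in run_card.dat)
myhist->Scale(xsec * lumi / nGen);   // the integral becomes the expected yield at 20/fb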

MadGraph Processes Generated
  • t channel 2-3 process:
      p p > b~ t j, t > b e+ ve
      p p > b t~ j, t~ > b~ e- ve~
  • ttbar Fully Leptonic:
      p p > t t~, t~ > e- ve~ b~, t > e+ ve b
  • ttbar Semi Leptonic:
      p p > t t~, t > e+ ve b, t~ > b~ j j
  • Z Plus Jets:
      p p > z j j, z > e+ e-
      p p > z j j j, z > e+ e-
  • W Plus Jets:
      p p > w- j j, w- > e- ve~
      p p > w- j j j, w- > e- ve~
      p p > w+ j j, w+ > e+ ve
      p p > w+ j j j, w+ > e+ ve
  • s channel:
      p p > t b~, t > e+ ve b
  • Wt Channel:
      p p > ve~ e- t, t > b j j
      p p > j j t, t > ve e+ b
  • WW Decay:
      p p > w+ w-, w+ > j j, w- > e- ve~
      p p > w+ w-, w- > j j, w+ > e+ ve
  • WZ Decay:
      p p > w+ z > e+ ve j j
      p p > w+ z > e+ e- j j
      p p > w- z > e+ e- j j
      p p > z w- > e- ve~ j j
  • ZZ Decay:
      p p > z z > e+ e- j j
Pre-Reconstruction Cuts
  • Required at least one b
  • Required lepton Pt > 25 GeV and |Eta| < 2.5
  • Required lepton dR > 0.2 with all other particles
  • Required number of leptons be 1
  • Required positive lepton (positron)
  • Clustered jets with dR < 0.4
  • Required jet Pt > 25 GeV and |Eta| < 4.5
  • Required MET Pt > 25 GeV

Post-Reconstruction Cuts
  • Reconstructed neutrino assuming the W mass to be 80.4 GeV and Lorentz invariance, using four-vectors (see the sketch after this list)
  • Required neutrino |Pz| < 150 GeV
  • Reconstructed W
  • Required reconstructed W Pt > 250 GeV and |Eta| < 4
  • Reconstructed top
  • Required reconstructed top mass to be > 166 GeV and < 180 GeV
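A sketch of the neutrino pz reconstruction from the W mass constraint (mW = 80.4 GeV). The function name and the choice of solution (real part if the solutions are complex, otherwise the smaller |pz|) are illustrative assumptions, not necessarily the exact convention used in this analysis:

#include "TLorentzVector.h"
#include "TMath.h"

double neutrinoPz(const TLorentzVector &lep, double metx, double mety) {
  const double mW = 80.4;                       // assumed W mass in GeV
  double pTl2 = lep.Px()*lep.Px() + lep.Py()*lep.Py();
  double mu   = 0.5*mW*mW + lep.Px()*metx + lep.Py()*mety;
  double a    = mu*lep.Pz()/pTl2;
  double disc = a*a - (lep.E()*lep.E()*(metx*metx + mety*mety) - mu*mu)/pTl2;
  if (disc < 0.) return a;                      // complex solutions: keep the real part
  double pz1 = a + TMath::Sqrt(disc);
  double pz2 = a - TMath::Sqrt(disc);
  return (TMath::Abs(pz1) < TMath::Abs(pz2)) ? pz1 : pz2;   // take the smaller |pz| solution
}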

-- ElizabethDrueke - 06 Jun 2013

Important Updated Information for Users of MadGraph/ROOT by Elizabeth Drueke, 17 June, 2014:

Here, the MadGraph/ROOT framework discussed above is used to study colored resonance particles at 8 TeV and 14 TeV. For this study, MadGraph5_v1_5_10 was used; this is not the most recently updated version of MadGraph. The newest version does not utilize the Template directory, as was done here; instead, all processes are run interactively. If you wish to download MadGraph rather than run from the website as cited above, you can download the latest version here after setting up your account.


Colored Resonance Particles

In our analysis, we investigated several different colored resonance particles -- the W' boson, Kaluza-Klein gluon, coloron, and colored octet. Each particle has its own unique model. The W' model can be found here. There is an additional wiki page (ATLAS internal) which explains the generation process and the parameters used in this W' paper and this one. In this case, the only parameters we changed were the gL/gR value, the mass of the W', and the width of the W'. We used masses at 250 GeV intervals from 500 GeV to 3 TeV throughout the analysis. The widths used can be found in the wiki or the referenced papers. They are provided by Zack Sullivan. The gL/gR parameter is perhaps the most difficult of the parameters we worked with. In fact, the gL/gR is actually the g' coupling. That is, it is the gL/gR value multiplied by the standard model coupling. When working with the left-handed W', we then set gR to 0 and gL to 1*6.483972e-01. Alternatively, when working with the right-handed W', we set gL to 0 and gR to 1*6.483972e-01. This analysis studied both left- and right-handed W's separately.

The other models are not as well documented. The Kaluza-Klein gluon model was generated in-house for this analysis. However, there is a KKg model available here. In this model the width of the KKg is scaled with the mass, so the mass is the only parameter which it is necessary to change. The in-house model does not scale the widths directly with mass.

The coloron model documentation is available online, though the model itself is not necessarily on a website where it can be easily accessed by the public. The colored octet model (more...). All model file changes can be made within the parameters.py file of the model directory or, alternatively, through the param_card.dat generated by running the ./bin/new_process command in the working directory. If this method is chosen, any and all changes must be made to the card after the process is generated but before the events are generated so that the changes are not overwritten by the new_process command.

The aim of the analysis is to study colored resonance particles decaying to similar final states, in our case l nu b j. Thus, the processes generated were:

W':
generate p p > wp- > w- b b~, w- > l- vl~ @1
add process p p > wp+ > w+ b b~, w+ > l+ vl @2

KKg:
generate p p > kkg > b~ c l- vl~ @1
add process p p > kkg > b c~ l+ vl @2

Coloron:
generate p p > GH , GH > b c~ l+ vl @1 GHT=1 QED=2
add process p p > GH , GH > b~ c l- vl~ @2 GHT=1 QED=2
add process p p > GH g, GH > b c~ l+ vl @3 GHT=1 QED=2
add process p p > GH g, GH > b~ c l- vl~ @4 GHT=1 QED=2

Colored Octet:
generate p p > t b, t > b l+ vl @1
add process p p > t~ b~, t~ > b~ l- vl~ @2

Here, the last two lines of the coloron generation code give a double-production mechanism by way of an extra gluon.

Additional Parameters

In addition to the model parameters, the parameters in the run_card.dat were modified in this study. We generated 1,000,000 events for each mass and energy, which is specified in the run_card as nevents. Additionally, we changed the energy to 8 TeV and 14 TeV. This is done by setting ebeam1 and ebeam2 each equal to half of the desired c.m. energy. For example, when generating 8 TeV samples, we set ebeam1 = ebeam2 = 4000.

Two more changes were made to the run_card. The first is changing the renormalization and factorization scales from the default 91.1880 GeV to the mass of the particle. This is accomplished by assigning scale, dsqrt_q2fact1, and dsqrt_q2fact2 all to the mass of the particle, and then changing the fixed_ren_scale and fixed_fac_scale parameters from F to T (indicating you indeed want to use the numbers you are putting in).
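As a sketch, and assuming a 1000 GeV resonance, the relevant run_card.dat lines end up looking roughly like this (parameter names as in the MadGraph5 1.5.x run_card; check your own card):

 T        = fixed_ren_scale  ! if .true. use fixed ren scale
 T        = fixed_fac_scale  ! if .true. use fixed fac scale
 1000     = scale            ! fixed ren scale
 1000     = dsqrt_q2fact1    ! fixed fact scale for pdf1
 1000     = dsqrt_q2fact2    ! fixed fact scale for pdf2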

The last change was generating the particles without cuts. This involves changing a good number of the parameters available in the run_card. First, one must set auto_ptj_mjj and cut_decays to F (indicating we are not using the standard MadGraph cuts). This should be enough to run the generation; however, to be safe we also set all cuts within the file to either 0 or -1, which indicates no cut will be placed on the variable in question.

The coloron samples, because of the double production mode, were found to encounter soft divergences when run without any MadGraph cuts. This is due to the extra gluon emitted in the decay. In order to remedy that issue, the coloron samples only were generated with the standard MadGraph cuts. It is anticipated that any difference in the cross sections generated by MadGraph for this particle with respect to the others will be remedied through the cuts made in the analysis.

Generating Events

This section will briefly describe generating events in the downloaded MadGraph directory from a terminal. After downloading MadGraph, untar the file and enter the directory. Once there, create the directory you want to work in. You will need a different directory for every process, because if you attempt to run two processes in the same directory you will get an error message. Enter the Template directory and copy its contents into your work directory. Then go into the Cards folder of your work directory. Change the proc_card and the run_card as appropriate. Remember to import any model you use that is not the Standard Model in your proc_card. Additionally, it is recommended that you include an @# label after every new process line as shown above. Once you have changed your run_card and proc_card as you'd like them, save them and exit the Cards subdirectory. Then run ./bin/newprocess_mg5. This will generate the Feynman diagrams for the processes you entered into the proc_card, which you can access through the index.html file.

The next step is to run ./bin/generate_events to create events in the Events subdirectory. However, there are several things you may wish to do/check between running your new process and generating events. If you wanted to change parameters in the model by changing the param_card.dat, you should do it here. The param_card.dat is found in the Cards subdirectory. Additionally, if you chose to include b b~ in the p, q, or j definitions in the proc_card, you should make sure they are still there. Sometimes that change will be deleted or changed back when the new process is generated. The last thing to check, particularly if you plan to submit the event generation to condor, is that you change the me5_configuration file by specifying the run_mode = 0. The default will run on every available condor node, which clogs the nodes and slows the jobs substantially when you are trying to run multiple at a time. Setting this parameter to 0 forces the jobs to run on one node, which is faster particularly with multiple jobs. When you are sure all of your parameters are as you'd like them to be, run the ./bin/generate_events command.

When this is done running, a directory will be created in the Events subdirectory of your working directory called run_01. Inside, you will find an events.lhe.gz and an unweighted_events.lhe.gz file, as well as a .txt file outlining the generation process. In order to use these files, you must unzip them and convert them to .root files. In addition to this, in the work directory you will find a crossx.html file outlining in more detail the generation and the contribution of the various Feynman diagrams to the total cross section.

Some Notes on Special Generations

Throughout this process, and especially in the case of the W' samples, it became important to specify whether events should be on- or off-shell, whether to include SM processes, etc. This section will provide a brief description of how to generate these special processes. Further elaboration is available in this presentation by Olivier Mattelaer.

Mattelaer's presentation of these topics, in summary, states that, for the p p > z > e+ e- process,

p p > e+ e- gives the correct distribution
p p > z, z > e+ e- gives the on-shell distribution only
p p > e+ e- /z gives the distribution without z-production
p p > e+ e- $z gives the off-shell distribution only

This came into play with our W' samples as we aimed to investigate the contribution of off-shell top quarks in the decay. We then used

Correct/Desired Distribution:
generate p p > wp- > w- b b~, w- > l- vl~ @1
add process p p > wp+ > w+ b b~, w+ > l+ vl @2

On-Shell Tops Only:
generate p p > wp- > t~ b, (t~ > w- b~, (w- > l- vl~)) @1
add process p p > wp+ > t b~, (t > w+ b, (w+ > l+ vl)) @2

No Top Production:
generate p p > wp- > w- b b~, w- > l- vl~ / t t~ @1
add process p p > wp+ > w+ b b~, w+ > l+ vl / t t~ @2

Off-Shell Tops Only:
generate p p > wp- > w- b b~, w- > l- vl~ $t t~ @1
add process p p > wp+ > w+ b b~, w+ > l+ vl $t t~ @2

The "No Top Production" was not looked into in-depth; however, we did look into the kinematics and cross sections of the other three available options. This lead us to believe that the off-shell top production in the left-handed W' samples may contribute to the cross sections of the events in such a way that, at higher masses, the left-handed cross section is significantly higher than the right-handed cross section.

In addition, we desired to investigate the interference of the W' with SM production. To do this, we generated W' with W-production and W production by itself. The codes used for these processes are:

W' Production + SM:
generate  p p > t b~, t > b l+ vl @1 QED=4
add process p p > t~ b, t~ > b~ l- vl~ @2 QED=4

SM Production Only:
generate  p p > w+ > t b~, t > b l+ vl @1 QED=4 NP=0 
add process p p > w- > t~ b, t~ > b~ l- vl~ @2 QED=4 NP=0

Here, QED specifies the maximum number of allowed vertices and NP specifies the maximum number of new-physics particles allowed in the process (i.e. NP=0 for the SM production because we didn't want W' bosons to be generated in these samples). We also investigated creating W' Production + SM with on- and off-shell tops. The code which appears to do this is:

generate p p > b~ b l+ vl /h a z g @1 QED=4 NP=2
add process p p > b b~ l- vl~ /h a z g @2 QED=4 NP=2

Here the /h a z g prevents the contribution of Feynman diagrams involving Higgs, photon, Z, and gluon production.

-- ElizabethDrueke - 25 Jun 2014

Running LHAPDF with MadGraph

Occasionally it is desirable to generate events or cross sections using different PDFs (other than the default CTEQ6L1). This section will outline how to get MadGraph running with the LHAPDF interface. Information on the interface can be found here.

In this example, I will explain running the interface with MadGraph5.v1.5.10 and LHAPDF-5.9.1 (which can be downloaded here). Once you unpack and unzip both the MadGraph and lhapdf files, put the lhapdf directory into the MadGraph directory. Enter the lhapdf directory and run the following commands:
./configure
make
make install
./configure --prefix=/my/madgraph/directory/
make
make install

From here, you're ready to decide which processes you'd like to generate. Go back into the MadGraph directory and copy the Template directory to your working directory. Once again, it is necessary to enter the lhapdf directory and run:
./configure --prefix=/my/madgraph/directory/working/
make
make install

This will create the files within your working directory that are necessary for successful implementation of the interface. Go back to your working directory. You should find a README.lhapdf file. Open it and change the run_card.dat as instructed. You will have to add the lhaid line underneath the already present pdf line. Next, you can follow the instructions above to change the run_card to specify any other parameters you may want and to change the proc_card to generate the specific process(es) you're interested in.
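For reference, the run_card.dat ends up with a pair of lines roughly like the following (the lhaid value shown is only a placeholder; take the correct number from the list of PDF sets mentioned below):

 'lhapdf'  = pdlabel  ! PDF set
  10800    = lhaid    ! LHAPDF number of the chosen set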

Next, it will be important to look at the list of available lhaids, which can be found here. Using the getdata command, specify which of the pdf sets you'll want to use. That is, in your working directory, run
./bin/lhapdf-getdata PDFSETS

In order to make everything completely ready to run, you must now run
mv *.LHgrid share/lhapdf
cp -r share/lhapdf/ lib/PDFsets

In the last line, you are actually creating the PDFsets directory. Now you can run your new_process command to generate diagrams for the process(es) you specified in your proc_card. The last step is to appropriately set your lhaid by following the chart here. Now you should be set to run your events and generate cross sections.

This method was used in calculating pdf uncertainties according to the practical guide. However, certain changes were made. The updated CT10 and NNPDF23 were used in place of CTEQ6 and NNPDF20. Additionally, an alpha_s value of 0.118 was used throughout.

-- ElizabethDrueke - 19 Aug 2014
Topic attachments:
  • Histograms_Lime_2.pdf (128.5 K, 27 Jul 2011, AndrewChegwidden)
  • Histograms_Lime_3.pdf (431.4 K, 27 Jul 2011, AndrewChegwidden)
  • Histograms_Lime_4.pdf (249.5 K, 27 Jul 2011, AndrewChegwidden)
  • Lime.root (100.2 K, 27 Jul 2011, AndrewChegwidden) Output ROOT file to be run in ROOT
  • LimeClass.C (9.2 K, 27 Jul 2011, AndrewChegwidden) Loop through each event
  • LimeClass.h (12.1 K, 27 Jul 2011, AndrewChegwidden) Associate raw data with defined variables
  • LimeEx.C (0.2 K, 27 Jul 2011, AndrewChegwidden) Executable to be run in ROOT
  • LimePre.root (5937.5 K, 27 Jul 2011, AndrewChegwidden) Input ROOT file from MadEvent
  • WTFeynmanDiagrams.ps (12.9 K, 03 Aug 2011, AndrewChegwidden) W-t Feynman Diagrams
  • input.root (5937.5 K, 12 Jul 2011, AndrewChegwidden) Input ROOT file from MadEvent
  • myMacro.C (2.3 K, 12 Jul 2011, AndrewChegwidden) Loop through each event
  • myMacro.h (8.7 K, 12 Jul 2011, AndrewChegwidden) Associate raw data with defined variables
  • myMacroEx.C (0.5 K, 12 Jul 2011, AndrewChegwidden) Executable to be run in ROOT