HLT Intro page:

The LHC will collide bunches of up to 10^11 protons 40 million times per second to provide 14 TeV proton-proton collisions at a design luminosity of 10^34 cm^-2 s^-1. The ATLAS read-out system can record about 200 events per second, so the ATLAS trigger system must select the interesting collisions, reducing the 40 MHz bunch-crossing rate to the 200 Hz data recording rate. To accomplish this task in real time, the ATLAS trigger system consists of three levels of event selection: Level-1 (L1), Level-2 (L2), and the event filter. The L2 and the event filter together form the High-Level Trigger (HLT). The University of Geneva has significant involvement in the HLT.

HLT Activities page:

The L1 trigger searches for signatures from high-pT muons, electrons/photons, jets, and tau-leptons decaying into hadrons. It also selects events with large missing transverse energy (ETmiss) and large total transverse energy. It uses reduced-granularity information from the calorimeters and the dedicated muon trigger chambers. The maximum L1 accept rate which the detector read-out systems can handle is 75 kHz (upgradeable to 100 kHz).

The L2 trigger is seeded by Regions-of-Interest (RoIs): regions of the detector where the L1 trigger has identified possible trigger objects within the event. The L2 trigger uses the RoI information on coordinates, energy, and type of signature to limit the amount of data which must be transferred from the detector read-out. The L2 trigger reduces the event rate to below 3.5 kHz, with an average event processing time of approximately 40 ms.

[RoI event picture]

The event filter uses offline analysis procedures on fully built events to further select events down to a rate which can be recorded for subsequent offline analysis. It reduces the event rate to approximately 200 Hz, with an average event processing time of order 4 seconds.

The HLT algorithms use the full granularity and precision of the calorimeter and muon-chamber data, as well as the data from the inner detector, to refine the trigger selections. Better information on energy deposition improves the threshold cuts, while track reconstruction in the inner detector significantly enhances the particle identification, for example distinguishing between electrons and photons.

The HLT is almost entirely based on commercially available computers and networking hardware. The computers run Scientific Linux CERN and are interconnected by multi-layer gigabit-Ethernet networks, one for control functionality and another for data movement. The initial deployment of the HLT has ~1100 nodes with 8 CPUs (Xeon E5160, 1.86 GHz) and 8 GB of RAM per node.

[HLT rack picture]

The University of Geneva has responsibilities in HLT management and in the selection algorithms for the electron, photon, and minimum-bias triggers.
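
To make the rate reduction concrete, the short Python sketch below works through the arithmetic implied by the numbers quoted above: the rejection factor of each trigger level and, via Little's law, roughly how many events must be processed concurrently at L2 and in the event filter. The constants simply restate the figures on this page; everything else is illustrative, not taken from the ATLAS trigger software.

```python
# Back-of-the-envelope estimates from the design rates and processing
# times quoted above; actual figures depend on run conditions.

L1_INPUT_HZ  = 40e6    # bunch-crossing rate seen by Level-1
L1_OUTPUT_HZ = 75e3    # maximum L1 accept rate (upgradeable to 100 kHz)
L2_OUTPUT_HZ = 3.5e3   # L2 output rate
EF_OUTPUT_HZ = 200.0   # event-filter output (recording) rate

L2_TIME_S = 40e-3      # average L2 processing time per event
EF_TIME_S = 4.0        # average event-filter processing time per event

# Rejection factor of each level = input rate / output rate
print("L1 rejection :", L1_INPUT_HZ / L1_OUTPUT_HZ)    # ~533
print("L2 rejection :", L1_OUTPUT_HZ / L2_OUTPUT_HZ)   # ~21
print("EF rejection :", L2_OUTPUT_HZ / EF_OUTPUT_HZ)   # 17.5
print("Overall      :", L1_INPUT_HZ / EF_OUTPUT_HZ)    # 200,000

# Little's law: events concurrently in flight = input rate x mean latency.
print("Events in flight at L2:", L1_OUTPUT_HZ * L2_TIME_S)   # ~3,000
print("Events in flight at EF:", L2_OUTPUT_HZ * EF_TIME_S)   # ~14,000
```

The overall factor of 200,000 is shared very unevenly between the levels, which is why L2 works on a small, RoI-guided subset of the event data while the event filter can afford full offline-style reconstruction.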
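
The RoI mechanism described above can be illustrated with a toy sketch in Python. This is not the ATLAS trigger software: the RoI class, the request_fragments function, the cone size, and the fake read-out map are all hypothetical, chosen only to show the idea that L2 requests just the read-out fragments close to each L1 region of interest rather than the full event.

```python
# Toy illustration (not ATLAS software) of RoI-seeded data access.
import math
from dataclasses import dataclass

@dataclass
class RoI:
    kind: str      # e.g. "e/gamma", "mu", "jet", "tau"
    eta: float     # pseudorapidity of the L1 object
    phi: float     # azimuthal angle of the L1 object
    et_gev: float  # transverse energy estimated by L1

def delta_r(eta1, phi1, eta2, phi2):
    """Angular distance with phi wrapped into [-pi, pi)."""
    dphi = (phi1 - phi2 + math.pi) % (2 * math.pi) - math.pi
    return math.hypot(eta1 - eta2, dphi)

def request_fragments(roi, readout_map, cone=0.4):
    """Return only the read-out fragments whose (eta, phi) lies within a
    cone around the RoI, i.e. a small fraction of the total event data."""
    return [frag for (eta, phi), frag in readout_map.items()
            if delta_r(eta, phi, roi.eta, roi.phi) < cone]

# Usage: a fake read-out map keyed by fragment position.
readout_map = {(0.1 * i, 0.2 * j): f"frag_{i}_{j}"
               for i in range(-20, 21) for j in range(16)}
roi = RoI(kind="e/gamma", eta=0.3, phi=1.2, et_gev=22.0)
fragments = request_fragments(roi, readout_map)
print(f"{len(fragments)} of {len(readout_map)} fragments requested "
      f"({100 * len(fragments) / len(readout_map):.1f}% of the event)")
```

In this toy example only a few per cent of the fragments are fetched, which mirrors the purpose of the RoI mechanism: keeping the data volume that L2 pulls from the detector read-out small.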