Article ID: 1846996
Journal: Nuclear Physics B - Proceedings Supplements
Published Year: 2007
Pages: 4
File Type: PDF
Abstract

From September 2007 the LHC accelerator will start its activity and CMS, one of the four experiments, will begin to take data. The CMS computing model is based on the Grid paradigm, where data are deployed and accessed at a number of geographically distributed computing centers. In addition to real data events, a large number of simulated ones will be produced in a similar, distributed manner. Both real and simulated data will be analyzed by physicists, at an expected rate of 100,000 jobs per day submitted to the Grid infrastructure. In order to reach these goals, CMS is developing two tools for workload management (plus a set of services): ProdAgent and CRAB. ProdAgent deals with the Monte Carlo production system: it creates and configures jobs, interacts with the Framework, merges outputs to a reasonable file size and publishes the simulated data back into the CMS data bookkeeping and data location services. CRAB (CMS Remote Analysis Builder) is the tool developed ad hoc by CMS to access those remote data. CRAB allows a generic user, without specific knowledge of the Grid infrastructure, to access data and perform an analysis as simply as in a local environment. CRAB takes care of the interaction with all Data Management services, from data discovery and location to output file management. An overview of the current implementation of the components of the CMS workload management is presented in this work.
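
The abstract outlines the steps that a tool such as CRAB automates on behalf of the analyst: data discovery and location, job creation, Grid submission, and output file handling. The Python sketch below illustrates that workflow only conceptually; the names used here (BookkeepingService, GridScheduler, split_dataset, run_analysis) are hypothetical placeholders and do not correspond to the actual CRAB, ProdAgent, or CMS data management APIs.

"""Conceptual sketch of the analysis workflow that a tool like CRAB automates.

All class and function names here are hypothetical illustrations,
not the actual CMS or CRAB interfaces.
"""

from dataclasses import dataclass, field
from typing import List


@dataclass
class Job:
    dataset: str
    files: List[str]          # input files processed by this job
    site: str                 # Grid site hosting the data
    output: str = ""          # filled in after the job completes


class BookkeepingService:
    """Stand-in for a data bookkeeping/location service: maps a dataset
    name to the files it contains and the site where they are hosted."""

    def locate(self, dataset: str) -> dict:
        # In reality this would query the experiment's data catalogues.
        return {
            "files": [f"{dataset}/file_{i}.root" for i in range(10)],
            "site": "T2_EXAMPLE_SITE",
        }


class GridScheduler:
    """Stand-in for the Grid workload management layer."""

    def submit(self, job: Job) -> Job:
        # A real submission would package the user's configuration into a
        # sandbox and send it to the site hosting the input data.
        job.output = f"out_{job.site}_{len(job.files)}files.root"
        return job


def split_dataset(dataset: str, files: List[str], site: str,
                  files_per_job: int = 5) -> List[Job]:
    """Split the input file list into jobs of a manageable size."""
    return [
        Job(dataset=dataset, files=files[i:i + files_per_job], site=site)
        for i in range(0, len(files), files_per_job)
    ]


def run_analysis(dataset: str) -> List[str]:
    """Discover the data, create jobs, submit them, and collect the outputs:
    the steps a user would otherwise perform by hand on the Grid."""
    bookkeeping = BookkeepingService()
    scheduler = GridScheduler()

    location = bookkeeping.locate(dataset)
    jobs = split_dataset(dataset, location["files"], location["site"])
    finished = [scheduler.submit(job) for job in jobs]
    return [job.output for job in finished]


if __name__ == "__main__":
    outputs = run_analysis("/SimulatedSample/Example/RECO")
    print(f"Collected {len(outputs)} output files")

The point of the sketch is the division of labour described in the abstract: the user supplies only a dataset name and an analysis configuration, while the tool handles data location, job splitting, submission, and output collection.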

Related Topics
Physical Sciences and Engineering › Physics and Astronomy › Nuclear and High Energy Physics