Article code: 1846996 · Journal code: 1528122 · Publication year: 2007 · English full-text version: 4 pages, PDF
English title (ISI article)
CMS workload management
Related subjects
Engineering and Basic Sciences · Physics and Astronomy · Nuclear and High-Energy Physics
English abstract

From September 2007 the LHC accelerator will start its activity and CMS, one of the four experiments, will begin to take data. The CMS computing model is based on the Grid paradigm, where data are deployed and accessed at a number of geographically distributed computing centers. In addition to real data events, a large number of simulated ones will be produced in a similar, distributed manner. Both real and simulated data will be analyzed by physicists, at an expected rate of 100,000 jobs per day submitted to the Grid infrastructure.

In order to reach these goals, CMS is developing two tools for workload management (plus a set of services): ProdAgent and CRAB. ProdAgent deals with the Monte Carlo production system: it creates and configures jobs, interacts with the Framework, merges outputs to a reasonable filesize, and publishes the simulated data back into the CMS data bookkeeping and data location services. CRAB (CMS Remote Analysis Builder) is the tool developed ad hoc by CMS to access those remote data. CRAB allows a generic user, without specific knowledge of the Grid infrastructure, to access data and perform their analysis as simply as in a local environment. CRAB takes care of interacting with all Data Management services, from data discovery and location to output file management.

An overview of the current implementation of the components of the CMS workload management is presented in this work.
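The abstract describes two core workload-management tasks: splitting a production or analysis request into many Grid jobs, and merging small job outputs up to a reasonable filesize before publication. The sketch below illustrates these two steps in miniature; it is a hypothetical simplification for illustration only, and the names (`split_request`, `merge_outputs`, `Job`) are not the actual ProdAgent or CRAB API.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical sketch of two workload-management steps; not the real
# ProdAgent/CRAB interfaces.

@dataclass
class Job:
    job_id: int
    first_event: int
    n_events: int

def split_request(total_events: int, events_per_job: int) -> List[Job]:
    """Split a request into jobs, each processing a contiguous slice of events."""
    jobs, first, jid = [], 0, 0
    while first < total_events:
        n = min(events_per_job, total_events - first)
        jobs.append(Job(jid, first, n))
        jid += 1
        first += n
    return jobs

def merge_outputs(file_sizes_mb: List[int], target_mb: int) -> List[List[int]]:
    """Group small output files into merge groups of roughly the target size."""
    groups: List[List[int]] = []
    current: List[int] = []
    size = 0
    for f in file_sizes_mb:
        if current and size + f > target_mb:
            groups.append(current)   # close the group before it overshoots
            current, size = [], 0
        current.append(f)
        size += f
    if current:
        groups.append(current)
    return groups
```

For example, a 1000-event request split into 300-event jobs yields four jobs (the last covering the remaining 100 events), and ten 50 MB outputs merged toward a 200 MB target produce groups of four, four, and two files.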

Publisher
Database: Elsevier - ScienceDirect
Journal: Nuclear Physics B - Proceedings Supplements - Volume 172, October 2007, Pages 141-144