Article ID Journal Published Year Pages File Type
1847648 Nuclear Physics B - Proceedings Supplements 2009 4 Pages PDF
Abstract

The CMS experiment at the LHC has had a distributed computing model since early in the project planning. The geographically distributed computing system is based on a hierarchy of tiered regional computing centers: data reconstructed at the Tier-0 are distributed and archived at the Tier-1 centers, where re-reconstruction of event data is performed and computing resources for skimming and selection are provided. The Tier-2 centers are the primary location for analysis activities. Analysis is thus performed in a distributed way using the Grid infrastructure. The CMS computing model architecture also aims to enable the worldwide collaboration of thousands of physicists (about 2600 from 180 scientific institutes) to access the data. To demand of the end user only a very limited knowledge of the underlying technical details, CMS has developed a set of specific tools built on the Grid services. This model is being tested in many Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community. In this talk the status, plans, and prospects for CMS analysis using the Grid are presented.

Related Topics
Physical Sciences and Engineering Physics and Astronomy Nuclear and High Energy Physics