GSI Forum
GSI Helmholtzzentrum für Schwerionenforschung

Re: Upcoming DC [message #6324 is a reply to message #6323] Wed, 09 April 2008 16:45
Jens Sören Lange
Hi all,

please wait a bit before checking out the svn.

I overlooked that there were hidden .svn files in the directories (because I had copied whole directories earlier), so when committing, svn overwrote _other_ directories (!). Mohammad is trying to fix it right now.

apologies, Soeren
Test Jobs [message #6325 is a reply to message #6203] Wed, 09 April 2008 17:06
Dan Protopopescu
Dear all,

In the test job JDLs, please make sure they save output (if applicable) to a directory writable by the production user 'pbarprod',
even if you run the job as yourself or as 'pbartest'. We will use 'pbarprod' for the data challenge next week.

Of course, I will double-check the JDLs proposed for the DC. :)
Informative Titles [message #6327 is a reply to message #6203] Wed, 09 April 2008 17:09
Dan Protopopescu
I also propose that we use more informative titles for our posts than the default 'Re: Upcoming DC'. 8-)
tpc and stt DC macros are in svn [message #6328 is a reply to message #6327] Wed, 09 April 2008 17:16
Jens Sören Lange
rev2450 now contains pandaroot/macro/dc1
UrQMD_Smm [message #6333 is a reply to message #6325] Wed, 09 April 2008 22:01
Johan Messchendorp
Hi,

I installed a 32-bit and a 64-bit version (binaries with libs only) of the UrQMDSmm event generator on the Grid, compiled against root5.18 (which is the most recent version of the external packages). It seems to work on the sites I could test it on. To test it on any site:

Installation (presently already installed at KVI, GSI, Juelich, Dubna, Bucharest)
---------------------------------------------------------
packman install pbarprod@urqmd::r2008

test scripts
---------
/panda/user/p/pbarprod/bin/testurqmd.sh
/panda/user/p/pbarprod/jdl/testurqmd.jdl

i.e.

submit /panda/user/p/pbarprod/jdl/testurqmd.jdl <id> <momentum> <nrofevents>

will do the job with the output @

/panda/user/p/pbarprod/jgm/urqmd/run<id>/..
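
For example (the parameter values here are just placeholders I made up: id 001, momentum 6.2, 100 events):

submit /panda/user/p/pbarprod/jdl/testurqmd.jdl 001 6.2 100

with the output then ending up under /panda/user/p/pbarprod/jgm/urqmd/run001/..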


Johan.
(ps thanks to Aida and Vladimir!)

[Updated on: Wed, 09 April 2008 22:38]

Storage@KVI [message #6334 is a reply to message #6203] Wed, 09 April 2008 22:13
Johan Messchendorp
Hi,

I have made 2 TByte of storage space (temporarily) available at KVI, in case it is needed for the DC. This space should already be accessible via the SE.

Johan.

[Updated on: Wed, 09 April 2008 22:19]

Pledged resources on our DC01 wiki [message #6344 is a reply to message #6323] Thu, 10 April 2008 15:02
Dan Protopopescu
Dear all,
I added tables with pledged resources on our wiki:
http://panda-wiki.gsi.de/cgi-bin/viewauth/Computing/DataChallenge1
Could you please help update these numbers for all the sites?
Could we also log data on the package installation results?
one more week left [message #6346 is a reply to message #6325] Thu, 10 April 2008 16:11
Paul Buehler
Dear colleagues,

I prepared a schedule for the days remaining until the start of the Data Challenge (summarizing what has been discussed so far). It is displayed at http://panda-wiki.gsi.de/cgi-bin/view/Computing/DataChallenge1#Schedule. Inputs and comments are highly appreciated. I will try to coordinate the final steps so that we are able to run the jobs on 17.-19. April.

The packages will have to be finalized very soon, installed and tested on the sites, and we have to decide on the macros to run. As a first step I would like to get an overview of the current status. Could you please help with this?



* Packages:
Can you, Soeren, please summarize the status of the software? What is the current status, and what is left to do?

* Macros:
The latest proposal by Soeren was (or did I miss something?)

--------------------------------------------
stt
===
root -b runsim.C"(nEvents,pT)"
root -b rundigi.C
root -b runreco.C

pT=30,40,50,...100 MeV/c
pT=100,200,300,...,1000 MeV/c
pT=1,2,3,...,7.5 GeV/c
10,000,000 events each

tpc
===
root -b run_sim_tpcmvd.C"(nEvents,pT)"
root -b run_rectrack_tpcmvd.C

pT=30,40,50,...100 MeV/c
pT=100,200,300,...,1000 MeV/c
pT=1,2,3,...,7.5 GeV/c
10,000,000 events each

dpm
===
Events and beam momenta for dpm will be proposed tomorrow
--------------------------------------------

Soeren, do you want to update?

What about the UrQMD and the fast sim macros Johan has already installed and tested on the grid?

Could you, Soeren and Johan, please try to make a concise list of possible simulations to run, based on the packages which will be available for the DC01?

Can you, Johan, please summarize the status of the macros and the jdls? Examples for UrQMD and the fast sim are obviously ready. What about the other simulations (stt, tpc, dpm)? Will you be able to take care of this?


* Storage:
One event is 250 bytes (the number was given by Soeren). Is this true for all of the proposed simulations (UrQMD?)? If yes, then we most probably have more than enough storage capacity (1 TB ~ 4e9 events).

* CPU time
Can someone give a (rough) estimate of the CPU time used per event? This number would help to decide on the number of events to submit (per job, and in total).

Cheers,

Paul
installation of relevant packages [message #6353 is a reply to message #6346] Thu, 10 April 2008 17:03
Johan Messchendorp
Hi all,

Once we have the most recent external packages installed, I will upload the most recent pandaroot revision, including the macros which are proposed for the DC. This I can do during the weekend. I will also try to initiate a remote installation at all the other sites and report back where things went wrong (using the template on the wiki site).

Kind wishes,

Johan.
Re: one more week left [message #6356 is a reply to message #6346] Thu, 10 April 2008 18:32
Jens Sören Lange
Hi Paul and all,

Quote:

* Packages:
> Can you, Soeren, please summarize the status of the software?
> What is the current status, and what is left to do?


I can only comment on PandaRoot itself.

There were only 2 issues left.

1.) if riemannfit is on, then lhetrack cannot run.
This is more complicated and will need time to solve.
However, in our understanding lhetrack is global (it has tpc and mvd),
while riemannfit is local (tpc only).
So we decided to switch riemannfit off for the DC
(I am sorry to Sebastian and Tobias for this,
but we simply have to make a decision due to time pressure).

2.) include mvd into the tpc macros
(and so run the _global_ tracking for both mvd and tpc).
This was solved today by Stefano with help from Tobias and Ralf.
He checked it in.

-> this means:

rev---- is final for tpc, stt, dpm.

As the UrQMD code is not in the svn repository, this means:

rev2480 can be regarded as the PandaRoot version for the DC
(note: revised by Soeren on Friday, Apr 11, 15:37)


(only exception: if there is a last minute bug fix)

THIS MEANS:

(please read and enjoy)

we will run global(!) tracking for tpc digis(!) and mvd digis(!)

Quote:

* Macros:
The latest proposal by Soeren was (or did I miss something?)
[...]
Soeren, do you want to update?


Yes, I would like to propose the beam momenta for the dpm simulations.
(these are input parameters for the bash scripts prepared by Johan).

[p_beam / GeV/c]  [what for?]

0.739 (for PhiPhi at 2.000 GeV)
2.202 (for PhiPhi at 2.500 GeV)
4.064 (for J/Psi 3.096 GeV)
6.234 (for Psi' 3.686 GeV)
6.571 (for Psi 3.770 GeV)
6.991 (for X 3.872 GeV)
7.277 (for Y 3.940 GeV)
7.705 (for Psi 4.040 GeV)
8.685 (for Y 4.260 GeV)
11.917 (for D*_sJ D_s 4.9178 GeV)
15.000 (for Drell-Yan)

I propose 10,000,000 events for each.
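
(For anyone who wants to cross-check how these momenta map to the quoted masses, here is a minimal ROOT snippet; this is just an illustration of the fixed-target kinematics, not part of the DC macros:)

// sqrt(s) for a pbar beam on a proton target at rest:
// sqrt(s) = sqrt(2*m_p^2 + 2*m_p*E_beam), with E_beam = sqrt(p_beam^2 + m_p^2)
Double_t sqrtS(Double_t pBeam) {
  const Double_t mp = 0.938272; // proton mass in GeV
  Double_t eBeam = TMath::Sqrt(pBeam*pBeam + mp*mp);
  return TMath::Sqrt(2.*mp*mp + 2.*mp*eBeam);
}
// e.g. sqrtS(6.234) = 3.686 GeV -> the Psi' entry above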

Quote:

Could you, Soeren and Johan, please try to make a concise list of possible simulations to run, based on the packages which will be available for the DC01?


Paul, can you specify your question a bit?

Because all of the above already is the list.

1. tpc sim and reco (see macros, nEvents, pT above)
2. stt sim and reco (see macros, nEvents, pT above)
3. dpm event generation
(bash scripts by Johan, see p_beam, nEvents just a few lines above)
4. UrQMD event generation
(see separate posting by Johan)

This is it.

Quote:

* Storage:
One event is 250 bytes (the number was given by Soeren). Is this true for all of the proposed simulations (UrQMD?)? If yes, then we most probably have more than enough storage capacity (1 TB ~ 4e9 events).


Paul, please be careful!!!

for stt and tpc sim and reco, it is much more!!!

The numbers are in my last posting.

for tpc: 56.5 kB (kilobytes) per event

for stt: 17.3 kB (kilobytes) per event

plus an offset for each run (the "zero events" case).
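
(A quick check of what this implies: at 56.5 kB/event, the proposed 10,000,000 tpc events per momentum setting already amount to ~565 GB, and the corresponding stt sample to ~173 GB. These are my own back-of-envelope numbers, so please double-check them.)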

cheers,

Soeren

[Updated on: Fri, 11 April 2008 15:40]

Re: one more week left [message #6361 is a reply to message #6356] Fri, 11 April 2008 09:50
Paul Buehler
Many thanks for the update!!

Paul
if you can wait, please wait a few hours ... [message #6364 is a reply to message #6361] Fri, 11 April 2008 12:05
Jens Sören Lange
Hi everyone,

just to let everyone know ...

I know we fixed the PandaRoot revision, but if there is a chance to wait for a few hours, please do so ...

Mohammad and Stefano are working on some more things.

The idea is to switch on a few more detectors (incl. EMC reco -> clusters for later track matching for both tpc and stt), and Mohammad is just writing an interface to call dpm from a root macro (so that the bash scripts are not needed). We also discussed running the dpm full sim (Geant3, Geant4) instead of only dpm event generation.

So, -> a few nice things!

Anyway, if there is any chance that you can wait a few hours, that would be great ...

cheers, Soeren
Re: if you can wait, please wait a few hours ... [message #6370 is a reply to message #6364] Fri, 11 April 2008 14:12
Florian Uhlig
Hi everyone

I installed the latest version of the external packages, which is called panda_extern::apr08. This is the version from March with some small changes for the grid. Installation at GSI works fine, so I don't expect problems at the other sites.

I will prepare everything for the installation of pandaroot.

Ciao

Florian
final PandaRoot for DC [message #6378 is a reply to message #6370] Fri, 11 April 2008 15:50
Jens Sören Lange
Hi all,

we proudly present

PandaRoot rev2480
-> this will now be the version for the DC.


/macro/dc1/stt

new: contains emc digis (clusters), too
for track matching


note: there is no stt rundigi.C macro anymore
-> it is now integrated into the runreco.C macro
(as it is also for tpc)

/macro/dc1/tpc

new: contains emc digis (clusters), too
for track matching


track visualization is now switched off
-> saves disk space

/macro/dc1/full

THIS IS NOW THE DPM !

(it means that basically no bash script is necessary anymore,
but if you want to run the DPMGen executable for some
reason, that also still works).

new: we run not only the event generation but even the full Geant3 simulation.

all detectors.

pT and nEvents are now parameters of the macro.

disk space: 105 kB per 1 event

(so no longer only 250 bytes/event!)

CPU time: ~1 sec per 1 event.
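
(Two illustrative numbers, assuming the full macro is called like the stt/tpc ones — the exact macro name and argument order here are my assumption:

root -b runsim.C"(10000,6.234)"

at 105 kB and ~1 sec per event, such a 10'000-event job would produce roughly 1 GB in roughly 3 hours.)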

note: there is no option for the seed.
Mohammad says that it is taken from the date.

NOTE! pgenerators/DpmEvtGen needs to be compiled on each site!
(this is not part of the cbuild)
but Florian will prepare it for the packages.

all is tested and works.

additional note:

all macros contain the newest mvd and newest forward emc geometry!

(they were generated with different external packages,
but Stefano fixed it).

Thanks everyone for debugging etc. (Mohammad, Stefano, Ola, Tobias, Florian, ...)

I think the phone companies will be very happy about today's bill.

[Updated on: Fri, 11 April 2008 15:51]

random number generator for DPMgen [message #6385 is a reply to message #6378] Fri, 11 April 2008 17:06
Johan Messchendorp
Hi,

One very important issue. How is the random seed for the new DPM interface set?

Will the instruction

gRandom->SetSeed(<my favorite seednumber>);

in the macro do the job for me?

Johan.

[Updated on: Fri, 11 April 2008 17:06]

Re: Upcoming DC - PandaRoot rev2480 [message #6387 is a reply to message #6203] Fri, 11 April 2008 19:01
Dan Protopopescu
Also from me, lots of thanks to Soeren and everyone on Soeren's list. Great job!

When everything is final, could we have a list of the package names to be installed for the DC posted on the wiki (as a reference for the site admins)?
http://panda-wiki.gsi.de/cgi-bin/view/Computing/DataChallenge1#Packages

Have a nice weekend.
Re: random number generator for DPMgen [message #6388 is a reply to message #6385] Fri, 11 April 2008 19:44
Mohammad Al-Turany
Hello Johan,


Quote:

Will the instruction
gRandom->SetSeed(<my favorite seednumber>);
in the macro do the job for me?



Not really, because dpm does not know anything about ROOT or gRandom.

In PndDpmDirect, the seed is set using:

Long_t Time = time(NULL);            // epoch time in seconds (1-second resolution)
int a = Time/100000;                 // integer division: everything above the last 5 digits
seed = Time - a*100000 + a/100000.;  // last 5 digits of the time, plus a tiny fractional part



which is the same as it was in the executable, except that it is changed for each event: each call for the next event sets the seed again. It is simple, but I think it is enough; with the executable you set the seed once for all events, whereas here it changes for each event! Anyway, if you think this is not enough, we can change it and take the seed from gRandom, for example; this should be straightforward!

regards

Mohammad

Re: random number generator for DPMgen [message #6389 is a reply to message #6388] Fri, 11 April 2008 22:20
Johan Messchendorp
Hi Mohammad and all others,

Thanks for your clear answer, Mohammad. I just wonder what the probability is that two (or more) jobs will start a DPM event with the same seed. The "time(NULL)" call gives an output with a precision of seconds, right? Hmmm, in that case it could actually be quite likely. It might therefore be safer to set the seed via gRandom, which one can more easily control from the "outside". Actually, is this the same for the box generator (as used for the other DC macros)?

Best wishes,

Johan

(ps: really nice work has been done for this DC. Thanks to everyone involved, really impressive!!!!!)
Re: random number generator for DPMgen [message #6391 is a reply to message #6389] Sat, 12 April 2008 19:46
Mohammad Al-Turany
Hi,

The box generator uses gRandom; the code in PndDpmDirect was taken from the original DPM files! Anyway, I have now changed it (rev. 2487) to use gRandom for setting the seed.

regards

Mohammad

[Updated on: Sat, 12 April 2008 19:47]

Re: random number generator for DPMgen [message #6392 is a reply to message #6391] Sat, 12 April 2008 19:58
Johan Messchendorp
Thanks Mohammad!

I also added a seed input to all the DC1 macros, and set gRandom according to this seed input. This modification has been included in rev. 2487 as well.
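
(Schematically, the pattern now looks like this; the macro and parameter names here are only illustrative, not the literal rev. 2487 code:

void runsim(Int_t nEvents = 100, Double_t mom = 1.0, UInt_t seed = 42)
{
  // give every grid job its own seed via the macro arguments,
  // instead of relying on the wall-clock time inside the generator
  gRandom->SetSeed(seed);
  // ... rest of the simulation macro ...
}
)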

Johan.
Re: Upcoming DC - Testjobs to be specified and documented [message #6427 is a reply to message #6203] Tue, 15 April 2008 10:41
Kilian Schwarz
Hi all,

would it be possible to put a list of test jobs, preferably on the Wiki, which are to be submitted by the site admins for various purposes?
So far only the urqmd test job is well documented. For everything else, the information does not seem so clear to me.

Cheers,

Kilian
Re: Upcoming DC - Testjobs to be specified and documented [message #6428 is a reply to message #6427] Tue, 15 April 2008 10:45
Paul Buehler
ok - I'll add some instructions for running test jobs.

Paul
Re: Upcoming DC - Testjobs to be specified and documented [message #6431 is a reply to message #6427] Tue, 15 April 2008 11:23
Paul Buehler
Dear all,

I added a section 'Running test jobs' on the wiki.

Paul
Run list for DC1 open for discussion [message #6471 is a reply to message #6431] Wed, 16 April 2008 16:51
Paul Buehler
Dear all,

I put a list of jobs to run during the DC on the wiki http://panda-wiki.gsi.de/cgi-bin/view/Computing/DataChallenge1 and would like to ask you for your comments and suggestions.

The list contains all the proposed scripts and momenta. However, it foresees producing only 100'000 events per momentum, not the 10'000'000 originally proposed.

There are:

- 240 stt splits of 10'000 events each, running ~2 h each, producing ~200 GB in total
- 240 tpc splits, 10'000 events, ~2 h, ~200 GB
- 110 full splits, 10'000 events, ~9 h, ~200 GB
- 100 urqmd splits, 10'000 events, ~1 h, ~1 GB
- 1000 generic splits, -, ~10 sec, ~10 MB

In total: ~2000 h, ~1 TB

The critical point, I think, will be the available CPU time and the data transfer time.
Alternatively, we could run only a few momenta with higher statistics.

Paul


Re: Run list and transfer bandwidth [message #6474 is a reply to message #6471] Wed, 16 April 2008 16:55
Kilian Schwarz
Hi Paul,

great !!!
Can we make a guess concerning the needed bandwidth for inter-site transfers?
I mean, we have a certain number of promised CPUs at each site.
This results in a certain number of jobs, and finally in a certain amount of output produced per unit of time, which needs to be transferred to certain places within a given time; this in turn translates into a needed bandwidth of X Mb/s between sites.
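
As a very first guess (using Paul's run list above, and an assumed concurrency): one tpc split produces about 200 GB / 240 ≈ 0.8 GB in ~2 h, i.e. ~0.4 GB/h ≈ 1 Mbit/s per running job. A site running 50 such jobs whose output leaves the site would then need ~50 Mbit/s sustained. A proper per-site estimate would of course need the actual CPU pledges.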

Cheers,

Kilian
Re: Run list and transfer bandwidth [message #6475 is a reply to message #6474] Wed, 16 April 2008 16:59
Dan Protopopescu
I propose that Kilian calculates that. :D
Re: Run list and transfer bandwidth [message #6477 is a reply to message #6475] Wed, 16 April 2008 17:16
Kilian Schwarz
and I thought Dan would do that ...

Cheers,

Kilian
Start of DC [message #6478 is a reply to message #6475] Wed, 16 April 2008 17:18
Paul Buehler
Dear all,

This is to inform you that I plan to start the job submission for the DC on April 17, at 08:00 AM Vienna time.

Please do not interfere by submitting jobs after that. I will announce the end of the DC, hopefully not later than April 21.

I'll try to regularly update the wiki http://panda-wiki.gsi.de/cgi-bin/viewauth/Computing/DataChallenge1 with some status reports to keep you informed.

Cheers,

Paul
Re: Start of DC - missing executables [message #6483 is a reply to message #6478] Thu, 17 April 2008 06:40
Kilian Schwarz
Hi Paul,

it seems as if the executables for the DC1 have not been registered correctly.
I get:
<307] /panda/user/p/pbarprod/bin/ > whereis run_stt_dc1.sh
Apr 17 06:38:47 info The file bin/run_stt_dc1.sh is in
<307] /panda/user/p/pbarprod/bin/ > whereis run_tpc_dc1.sh
Apr 17 06:38:54 info The file bin/run_tpc_dc1.sh is in
<307] /panda/user/p/pbarprod/bin/ > whereis run_urqmd.sh
Apr 17 06:39:01 info The file bin/run_urqmd.sh is in

but for example:
<307] /panda/user/p/pbarprod/bin/ > whereis sim_ana_chain.sh
Apr 17 06:39:44 info The file bin/sim_ana_chain.sh is in
SE => PANDA::Glasgow::raid0 pfn => root://panda.gla.ac.uk:8222//raid/SEData/02/20001/f84537b8-733f-11dc-865c-00e081408d5a.1191587949

which is the correct output, I believe.
We should have a storage element assigned to the files.

Cheers,

Kilian
Re: Start of DC - missing executables [message #6485 is a reply to message #6483] Thu, 17 April 2008 07:40
Paul Buehler
Indeed - the executables seem to have disappeared.

Might it be that this is connected with the fact that Glasgow is currently down?

Paul
Re: Start of DC - missing executables [message #6486 is a reply to message #6485] Thu, 17 April 2008 07:48
Kilian Schwarz
Hi Paul,

but PANDA::Glasgow::raid0

seems to be online

I can also register files to Glasgow.

Can you re-register them, e.g. at GSI?

Please use:
PANDA::GSI::virtual

for the DC.

NOT
Panda::GSI::file

Cheers,

Kilian
new SE at GSI [message #6487 is a reply to message #6486] Thu, 17 April 2008 07:49
Kilian Schwarz
Dear all,

for the upcoming data challenge:

please use

panda::gsi::virtual

and not

panda::gsi::file

The new SE is an xrootd cluster with significantly more storage space than the old NFS-mounted disk.

But both SEs are still operational, in case you want to access old files.

Cheers,

Kilian
Re: Start of DC - missing executables [message #6488 is a reply to message #6486] Thu, 17 April 2008 08:11
Paul Buehler
I did re-register the scripts. They are now at PANDA::Vienna::file.

Paul
Re: Start of DC - missing executables [message #6489 is a reply to message #6488] Thu, 17 April 2008 08:19
Kilian Schwarz
thanks :)

Cheers,

Kilian
DC1 finished [message #6535 is a reply to message #6489] Mon, 21 April 2008 08:27
Paul Buehler
Dear colleagues,

The first PANDA grid Data Challenge, which had been running since last Thursday morning, is now finished. A brief summary of what has been happening in the past few days can be found at http://panda-wiki.gsi.de/cgi-bin/view/Computing/DataChallenge1#Results.

This DC was very helpful. We will have to analyze and evaluate the results, but I am sure that the experience we gained will help to further improve the procedures, the system, and its operation.

Many thanks to all of you who have helped in the last days to get and keep the system running.

Best regards,

Paul
Re: DC1 finished [message #6550 is a reply to message #6535] Mon, 21 April 2008 13:23
Kilian Schwarz
Hi Paul,

thanks a lot for the great work you did.
Why did you put the RUNNING distribution on the Wiki?
If you put up the DONE distribution, it looks quite different. According to last week's DONE numbers in MonALISA, GSI computed about 50% of the jobs, while the other sites shared the other 50%. Ateneo contributed almost 6%, which is great with 10 machines.
What was the main reason why the STT jobs did not work?
Also in the full_dc quite a few jobs seem to have failed. What was the main reason here?
All in all we seem to have close to a 50% error rate, which needs to be improved somehow.

Cheers,

Kilian
Re: DC1 finished [message #6551 is a reply to message #6550] Mon, 21 April 2008 13:33
Dan Protopopescu
Dear all,

We are still updating the DC1 wiki. I will add final plots and graphs from MonALISA shortly. Please watch it for updates.

DC1 Summary [message #6553 is a reply to message #6550] Mon, 21 April 2008 16:21
Dan Protopopescu
Dear all,
I updated the wiki with more comments and plots. Please review the updated info at:
http://panda-wiki.gsi.de/cgi-bin/view/Computing/DataChallenge1#Evaluation
We will update the summary of the physics data produced asap.
Re: DC1 Summary [message #6557 is a reply to message #6553] Tue, 22 April 2008 08:09
Kilian Schwarz
great plots and description. Very informative.
I think the whole exercise was a big success !!!

Cheers,

Kilian