CERN

LPC meeting summary 17-10-2016 - final



Minutes and Summary

Main purpose of the meeting: Discussion of possible running scenarios in 2017.

Introduction (Jamie Boyd)

Jamie summarised the running period from MD4 to now. He noted the not-yet-understood problem with the ALICE magnet powering, which gives rise to concern. Due to the reversal of the LHCb dipole, bunch-length leveling has now been re-introduced until the end of the proton period. LHCb will stay in this configuration until the end of the year (including the heavy-ion period).

Jamie summarised the special activities during the last running period. In particular he pointed out the high pile-up test fill with high-intensity individual bunches and high-intensity trains of 48b. The observed pile-up was reported by ATLAS/CMS to be around 90 for the INDIVs and 50 for the trains. Detailed analysis is ongoing. Jörg pointed out that the emittances of the INDIVs blew up by a factor of 2. CMS stated that its online lumi system might have underestimated the pile-up, whereas ATLAS stated that its online pile-up numbers might have been an overestimation.
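For orientation (not from the minutes themselves): the pile-up values quoted above can be related to the per-bunch luminosity via μ = σ_inel · L_bunch / f_rev. A minimal sketch, where the inelastic cross section and the per-bunch luminosity are assumed, illustrative numbers:

```python
# Illustrative pile-up estimate; all parameter values are assumptions,
# not numbers reported in the meeting.
F_REV = 11245.0        # LHC revolution frequency [Hz]
SIGMA_INEL = 8.0e-26   # assumed inelastic pp cross section, ~80 mb [cm^2]

def pileup(lumi_per_bunch):
    """Average pile-up mu = sigma_inel * L_bunch / f_rev."""
    return SIGMA_INEL * lumi_per_bunch / F_REV

# An assumed per-bunch luminosity of ~1.3e31 cm^-2 s^-1 gives a pile-up
# in the ballpark of the ~90 reported for the INDIVs:
lumi_bunch = 1.3e31
print(f"mu ~ {pileup(lumi_bunch):.0f}")
```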

In the discussion Jörg pointed out a current problem in the ALICE leveling implementation. The operations team is trying to work around the problem for the last days of proton running. Next year the leveling will be re-implemented in a new server, where the problem will be properly addressed.

LHC options 2017 (Mike Lamont)

Mike enumerated the various existing options for next year's running scenario:

Mike summarised the known limitations of the accelerator complex (relevant for LHC operations):

It remains to be seen whether a scrubbing run will be needed at the beginning of 2017. However, if the dipole magnet with the potential short is replaced during the EYETS, the scrubbing of that sector will be lost. It is not known how much time will be needed to re-condition this sector to its previous characteristics.

The current schedule v0.5 foresees 152 physics days in 2017.

Mike listed possible machine parameters for nominal and BCMS running in two tables, one assuming a β* of 40 cm, the second a β* of 33 cm. The values in these tables are only indicative and might change. The emittances in the tables have been scaled up due to the higher intensities.

In response to a question from Roberto Carlin (CMS), Mike estimated that the 144b limit for the BCMS beam cannot be improved.

Mike and Jörg estimated the chances of improving the emittance blow-up in the LHC to be small. Jörg stated that the source of the blow-up is not known.

In the discussion on a possibly lower β* it was asked whether going too close to the limits would reduce the availability. Mike answered that the experts think they can go to 33 cm without deteriorating the availability. Jörg stated that β* seems to be easy to optimise, but effects due to triplet movements would increase. Jamie stated that he would prefer a further reduction of the crossing angle to lowering β*, in order to mitigate the luminosity difference between ATLAS and CMS that arises from the geometric factors and the different crossing planes of the two experiments. Mike stated that the current crossing-angle choices are already moderately aggressive; this can also be seen in the higher losses during injection. Sudan asked whether the 96 bpi limit would be hard in case the SPS beam dump is not exchanged. The answer was that this limit is not hard.
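The geometric luminosity reduction behind this crossing-angle versus β* trade-off can be sketched with the standard Piwinski factor; all parameter values below (emittance, bunch length, crossing angle) are illustrative assumptions, not numbers from the meeting:

```python
import math

def geometric_factor(full_crossing_angle, sigma_z, sigma_star):
    """Piwinski reduction factor F = 1/sqrt(1 + (theta_c/2 * sigma_z/sigma*)^2)."""
    phi = (full_crossing_angle / 2.0) * sigma_z / sigma_star
    return 1.0 / math.sqrt(1.0 + phi**2)

# Assumed 2016-like parameters (illustrative only):
eps_n  = 2.5e-6             # normalised emittance [m]
gamma  = 6500.0 / 0.938272  # Lorentz factor at 6.5 TeV
beta_s = 0.40               # beta* [m]
sigma_star = math.sqrt(eps_n / gamma * beta_s)  # transverse IP beam size [m]

# Assumed 370 urad full crossing angle and 9 cm bunch length:
F = geometric_factor(370e-6, 0.09, sigma_star)
print(f"F ~ {F:.2f}")  # fraction of head-on luminosity retained
```

With these assumed numbers the crossing angle already costs a sizeable fraction of the head-on luminosity, which is why a smaller crossing angle and a smaller β* compete as options.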

Leveling options for 2017 (Jörg Wenninger)

Jörg listed the possible leveling options with increasing complexity:

Leveling by separation is the simplest method for leveling factors up to 2 and does not need additional commissioning or MPP validation. Experience in 2016 showed that the beams always stayed stable; however, this behaviour cannot be guaranteed for next year, since it results from a complex interplay of many beam parameters. In Run 1 beams did become unstable, but the 50 ns beam had a much higher brightness than the 25 ns beam used in Run 2.
A negative side effect of leveling by separation could be the sensitivity of the luminosity to small variations of the separation induced by noise or triplet movements.
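As a rough sketch (assuming two round Gaussian beams, which is not stated in the minutes), the separation needed for a given leveling factor follows from the Gaussian overlap L/L0 = exp(-d²/4σ²), which also shows the sensitivity to small separation errors:

```python
import math

def leveling_factor(sep_in_sigma):
    """Luminosity reduction L/L0 for two round Gaussian beams separated by d = sep*sigma."""
    return math.exp(-sep_in_sigma**2 / 4.0)

def separation_for_factor(target):
    """Separation (in units of sigma) giving L/L0 = 1/target."""
    return 2.0 * math.sqrt(math.log(target))

# Leveling by a factor of 2 needs a separation of about 1.67 sigma:
d = separation_for_factor(2.0)
print(f"d = {d:.2f} sigma, L/L0 = {leveling_factor(d):.2f}")

# Sensitivity: a small orbit jitter of 0.05 sigma around this working point
# already shifts the luminosity by a few percent.
jitter = leveling_factor(d + 0.05) / leveling_factor(d)
print(f"relative lumi change for +0.05 sigma jitter: {jitter - 1:+.1%}")
```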

Leveling by crossing angle can be used in a range of up to 20% from the peak lumi. It can be used as anti-leveling (i.e. the crossing angle can only be reduced, to increase the lumi towards the end of a fill). In long fills the anti-leveling could gain up to 5% of integrated luminosity. The scheme is compatible with separation leveling. Crossing-angle leveling should be transparent for AFP but not for CTPPS with the current bump. It was noted that the orbit distortions needed to implement a crossing-angle change for a given lumi change are much larger than those associated with separation leveling; this is the main reason for the increased complexity of crossing-angle leveling compared to separation leveling. Since the orbit changes during crossing-angle leveling are large, the TCTs need to be moved (or set to a position compatible with a range of crossing angles) and the orbit must be controlled with the orbit feedback system.

β* leveling is the most complex method, requiring essentially a squeeze of the beams in Stable Beams. The optics is changed during the leveling. With ATS optics these changes would even be global and would impact ALICE and LHCb. One would have to expect losses during the transitions, as are observed during the squeeze today. Jörg explained the functioning of the current squeeze, which uses perfectly matched intermediate points with transitions between them during which parameters are interpolated and optics errors appear.
A possible scenario for β* leveling in 2017 would be to use discrete β* values in Stable Beams and to change the beam mode during the transitions.
Finally, Jörg remarked that the lumi ratio of ATLAS and CMS might change as a function of β*. In addition it is unknown how stable the offset beams in LHCb and ALICE would be during a β* leveling step. Also the beam size at the Roman pots will vary during β* leveling, which needs to be considered.
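The β* dependences Jörg mentions can be sketched with the usual scalings σ ∝ √β and L ∝ F(β*)/β*, where the geometric factor F itself depends on β* through the IP beam size. All numbers below are illustrative assumptions, not values from the talk:

```python
import math

def sigma(eps_geo, beta):
    """Transverse beam size sqrt(eps * beta) [m]."""
    return math.sqrt(eps_geo * beta)

eps_geo = 3.6e-10   # assumed geometric emittance [m]
sigma_z = 0.09      # assumed bunch length [m]
theta_c = 370e-6    # assumed full crossing angle [rad]

for beta_star in (0.40, 0.33):
    s = sigma(eps_geo, beta_star)
    F = 1.0 / math.sqrt(1.0 + (theta_c / 2.0 * sigma_z / s) ** 2)
    # Relative luminosity scales as F / beta*:
    print(f"beta*={beta_star*100:.0f} cm: sigma*={s*1e6:.1f} um, "
          f"F={F:.2f}, F/beta*={F/beta_star:.2f}")
```

Note that F shrinks as β* goes down, partly offsetting the 1/β* gain; since this geometric penalty differs between the two crossing planes, the ATLAS/CMS luminosity ratio can indeed shift with β*.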

Jörg recommends using leveling by separation in 2017, due to its simplicity and the experience of the past years. Mike stated that he cannot imagine having a working β* leveling in 2017.

In the discussion it was made clear that in all cases the machine would rely on data from the experiments to adjust the leveling. If the BLMs in the triplets are linear in the leveling range, they could also be used as a feedback input.
Roberto wanted to know when the machine experts expect to have more information on the cooling limits of the triplets. Mike answered that very preliminary investigations seem to indicate a significant margin on top of the current cooling limit, and it looks as if this limit could be increased significantly once the ongoing studies are finished.
In the discussion Mike also stated that he can promise that the machine will not exceed a peak luminosity of 2×10^34 cm^-2 s^-1 until LS2.
Roberto stated that it is difficult for the experiments to estimate the effect of leveling on the total luminosity. At the beginning of a fill the losses are currently about 30% higher than the losses due to burn-off alone, and it is not clear how this behaviour changes under leveling. Jörg stated, however, that the non-burn-off losses do not seem to change significantly during leveling.
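The burn-off baseline against which that 30% figure is measured can be estimated as follows; every parameter value here is an illustrative assumption, not a number from the meeting:

```python
# Illustrative burn-off estimate (assumed parameters, not from the minutes).
SIGMA_BO = 1.0e-25   # assumed burn-off cross section, ~100 mb [cm^2]
LUMI     = 1.4e34    # assumed luminosity per high-lumi IP [cm^-2 s^-1]
N_IP     = 2         # ATLAS and CMS
N_BEAM   = 2.2e14    # assumed total protons per beam

burn_rate = SIGMA_BO * LUMI * N_IP       # protons lost per second per beam
tau_hours = N_BEAM / burn_rate / 3600.0  # burn-off-only intensity lifetime

# Losses observed 30% above pure burn-off correspond to a shorter
# effective intensity lifetime:
tau_eff = tau_hours / 1.3
print(f"burn-off lifetime ~ {tau_hours:.0f} h, "
      f"with 30% extra losses ~ {tau_eff:.0f} h")
```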

Roberto asked what the machine experts would do if they need to level because of the triplet cooling but find that the leveling would lead to instabilities. There is no strategy worked out for this scenario yet.

In his concluding remarks, Jamie outlined the process for arriving at a decision on the running scenario: for Evian the LPC expects a baseline request from the experiments, and the final decision will be taken at Chamonix. It is proposed to come back to this discussion at an LPC meeting in four weeks (mid-November) to converge on the baseline request for Evian. In addition, the next LHCC will discuss this topic.