A debate that helps us understand one of the key steps in the development of new medicines.

“Primary endpoints in Oncology Clinical Trials” is the title of a lecture organised by the Italian Medical Society of Great Britain (IMSoGB) at the Italian Cultural Institute in London. It is just one of the many cultural events organised by IMSoGB to promote new collaborations among doctors and to spread medical knowledge.

This topic may seem to be only for experts, but the debate was a chance to learn more about one of the key steps in medical research and development. The speakers, all members of IMSoGB and all international experts, discussed these research studies from three different perspectives:

  1. From the clinician’s point of view: Professor Riccardo Audisio, MD, FRCS, President of the European Society of Surgical Oncology (ESSO).
  2. From the regulatory body’s point of view: Prof. Guido Rasi, former Executive Director of the European Medicines Agency.
  3. From the healthcare industry’s point of view: Dr Lucio Fumi, Head of Medical Affairs, International Oncology, Terumo.

But what are clinical trials? What are primary endpoints? And why are they so important?

Well, let’s see…

The purpose of a clinical trial is to collect data to evaluate the effectiveness and safety of medications or medical devices: trials determine whether new biomedical or behavioural interventions are safe and effective, and they compare a new treatment with the standard ones. Clinical trials are therefore essential to medical research and development; without them there would be no advancement in medicine.

Clinical trials are usually conducted in four phases. Potential treatments that look promising in pre-clinical studies must first cross the perilous “Funding Valley of Death” before they can be tested in clinical trials. As Dr Fumi says, “the image of the Valley of Death is often used to describe graphically the lack of continuum between development and marketing”.

 

In fact, many promising drugs, devices and other potential treatments are never tested in clinical trials because of a lack of funding or other business difficulties. In many cases, further financial support or partnerships are needed to help “bridge the gap” across the Funding Valley of Death and begin a Phase 1 clinical trial.

Endpoints are results, conditions or events associated with individual study patients that are used to assess study treatments. The primary endpoint is the single parameter on which rejection of the null hypothesis is based. A trial may also define one or more secondary endpoints. These typically include secondary efficacy measures (additional evaluations designed to assess the clinical effectiveness of the drug in controlling the disease) and safety endpoints (designed to measure the tolerability and safety of the treatment over the study period).

 

These parameters should be easy to assess, free of measurement error and reliable on repeated measurement. The endpoint has to be clearly defined in advance and should involve no subjectivity.

It is important to note that designing an oncology clinical trial is more difficult than designing other clinical trials. Firstly, cancer is a group of highly heterogeneous and complex diseases, and therapy is often a combination of surgery, radiotherapy and chemotherapy, so the overall treatment changes for every patient. Secondly, one phase of testing a new drug involves a comparison between a group taking the drug and a group taking a placebo.

However, this raises the ethical problem of leaving cancer patients without treatment. Another important point is the difference between the regulatory bodies in the EU and the USA. The FDA often uses Overall Survival (OS) as the primary endpoint.

Overall Survival is the percentage of patients alive at a defined period of time after diagnosis or, in treatment studies, the percentage of patients alive at a defined time after initiation of the treatment. OS is often reported as a five-year survival rate, i.e. the percentage of patients alive five years after diagnosis or treatment.
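As a rough illustration of how such a rate could be computed (a minimal sketch in Python using invented numbers; real trials use Kaplan-Meier estimates that properly account for patients lost to follow-up):

# Hypothetical example: crude five-year overall survival rate.
# Real analyses use Kaplan-Meier estimation to handle censored patients.
def five_year_os(survival_times_years):
    """Fraction of patients still alive five years after starting treatment."""
    alive_at_5y = sum(1 for t in survival_times_years if t >= 5.0)
    return alive_at_5y / len(survival_times_years)

# 6 of these 10 (invented) patients survive beyond five years -> OS = 0.6
print(five_year_os([1.2, 6.0, 7.5, 3.1, 5.2, 8.0, 2.4, 5.1, 9.3, 4.8]))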

The most interesting point of the lecture was that all three speakers agreed on the need to use different primary endpoints for oncology clinical trials.

I had the great pleasure of interviewing Dr Fumi, who said that using alternative primary endpoints in oncology clinical trials can be very useful for the development of new treatments. It is also important to use short-term endpoints, he added, because this is the best way to assess a treatment step by step.

Speaking of alternative primary endpoints, one that is used in the UK is the QALY (quality-adjusted life year), a measure of disease burden that combines both the quality and the quantity of life lived.
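As a hedged illustration (the weights below are invented for the example), one QALY corresponds to one year lived in perfect health, and time lived is weighted by a quality-of-life score between 0 and 1:

# Hypothetical example: QALYs = years lived x quality-of-life weight (0 = death, 1 = perfect health).
def qalys(periods):
    """periods: list of (years, quality_weight) tuples for successive phases of care."""
    return sum(years * weight for years, weight in periods)

# Two years in good health (weight 0.9) plus one year with side effects (weight 0.5)
print(qalys([(2.0, 0.9), (1.0, 0.5)]))  # -> 2.3 QALYs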

This is a very interesting primary endpoint, because quality of life is one of cancer patients’ major concerns. In addition, quality of life and tumour response have to be taken into consideration more than in the past.

Designing primary endpoints is one of the most challenging issues in clinical trials: the wrong choice can delay the development of a new drug or invalidate a study. For this reason, the choice of clinical efficacy endpoints remains a controversial topic.

To conclude, it is beyond question that a key to designing a successful clinical trial is the professional partnership of doctors with different backgrounds and specialisms. Collaboration is essential to cover all aspects of these research studies.

The clinical, regulatory and industry points of view are all equally important in guaranteeing the development of the best treatment.

The balance between benefits and risks has to be considered because it occupies a central place in licensing and approval decisions, Prof Rasi said. The priority, the speakers agreed, is always to identify and promote safe, effective, patient-centred cancer care.

ICARUS neutrino experiment to move to Fermilab

Geneva, 22 April 2015. A group of scientists led by Nobel laureate Carlo Rubbia will transport the world’s largest liquid-argon neutrino detector across the Atlantic Ocean from CERN to its new home at the U.S. Department of Energy’s Fermi National Accelerator Laboratory.

The 760-ton, 20-metre-long detector took data for the ICARUS experiment at the Italian Institute for Nuclear Physics’ (INFN) Gran Sasso National Laboratory in Italy from 2010 to 2014, using a beam of neutrinos sent through the earth from CERN. The detector is now being refurbished at CERN, where it is the first beneficiary of a new test facility for neutrino detectors.

When it arrives at Fermilab, the detector will become part of an on-site suite of three experiments dedicated to studying neutrinos, ghostly particles that are all around us but have given up few of their secrets.

All three detectors will be filled with liquid argon that enables the use of state-of-the-art time projection technology, drawing charged particles created in neutrino interactions toward planes of fine wires that can capture a 3-D image of the tracks those particles leave. Each detector will contribute different yet complementary results to the hunt for a fourth type of neutrino.

“The liquid-argon time projection chamber is a new and very promising technology that we originally developed in the ICARUS collaboration from an initial table-top experiment all the way to a large neutrino detector,” Rubbia said. “It is expected that it will become the leading technology for large liquid-argon detectors, with its ability to record ionizing tracks with millimetre precision.”

Fermilab operates two powerful neutrino beams and is in the process of developing a third, making it the perfect place for the ICARUS detector to continue its scientific exploration. Scientists plan to transport the detector to the United States in 2017.

A planned sequence of three liquid-argon detectors will provide new insights into the three known types of neutrinos and seek a yet unseen fourth type, following hints from other experiments over the past two decades.

Many theories in particle physics predict the existence of a so-called “sterile” neutrino, which would behave differently from the three known types and, if it exists, could provide a route to understanding the mysterious dark matter that makes up 25 percent of the universe. Discovering this fourth type of neutrino would revolutionize physics, changing scientists’ entire picture of the universe and how it works.

“The arrival of ICARUS and the construction of this on-site research programme is a lofty goal in itself,” said Fermilab Director Nigel Lockyer. “But it is also the first step forward in Fermilab’s plan to host a truly international neutrino facility, with the help of our partners from around the world. The future of neutrino research in the United States is bright.”

Fermilab’s proposed suite of experiments includes a new 260-ton Short Baseline Neutrino Detector (SBND), which will sit closest to the source of the particle beam. This detector is under construction by a team of scientists and engineers from universities and national laboratories in the United States and Europe.

The neutrino beam will then encounter the already-completed 170-ton MicroBooNE detector, which will begin operation next year. The final piece is the ICARUS detector, which will be housed in a new building to be constructed on site.

Construction on the ICARUS and SBND buildings is scheduled to begin later this year, and the three experiments should all be operational by 2018. The three collaborations include scientists from 45 institutions in six countries.

The move of the ICARUS detector is a sterling example of cooperation between countries (and between three scientific collaborations) to achieve a global physics goal. The current European strategy for particle physics, adopted by the CERN Council, recommends that Europe play an active part in neutrino experiments in other parts of the world, rather than carry them out at CERN.

The U.S. particle physics community has adopted the P5 (Particle Physics Project Prioritization Panel) plan, which calls for a world-class long-distance neutrino facility to be built at Fermilab and operated by an international collaboration. Fermilab, CERN, INFN and many other international institutions are expected to be partners in this endeavour.

Knowledge gained by operating the suite of three liquid-argon experiments will be important in the development of the DUNE experiment at the planned long-distance facility at Fermilab. DUNE will be the largest neutrino oscillation experiment ever built, sending particles 800 miles from Fermilab to a 40,000-ton liquid-argon detector at the Sanford Underground Research Lab in South Dakota.

“The journey of ICARUS from Italy to CERN to the U.S. is a great example of the global planning in particle physics,” said CERN Director General Rolf Heuer. “U.S. participation in the LHC and European participation in Fermilab’s neutrino programme are integral parts of both European and U.S. strategies.

I am pleased that CERN has been able to provide the glue that is allowing DUNE to get off the ground with the transport of ICARUS.”

“The ICARUS T600 is the only detector in the world with more than 600 tons of argon to have been successfully operated,” said INFN’s deputy president Antonio Masiero. “ICARUS uses a high-precision, innovative technique to detect neutrinos artificially produced in an accelerator. This technique, developed at the INFN and first successfully put into operation in the ICARUS experiment at the INFN’s Gran Sasso National Laboratory, will make in the new dedicated facility at Fermilab a fundamental contribution to neutrino research.”

Proton beams are back in the LHC

Geneva -After two years of intense maintenance and consolidation, and several months of preparation for restart, the Large Hadron Collider, the most powerful particle accelerator in the world, is back in operation.

Today at 10.41am, a proton beam was back in the 27-kilometre ring, followed at 12.27pm by a second beam rotating in the opposite direction.

These beams circulated at their injection energy of 450 GeV. Over the coming days, operators will check all systems before increasing the energy of the beams.

 

“Operating accelerators for the benefit of the physics community is what CERN’s here for,” said CERN Director General Rolf Heuer. “Today, CERN’s heart beats once more to the rhythm of the LHC.”

“The return of beams to the LHC rewards a lot of intense, hard work from many teams of people,” said Head of CERN’s Beam Department, Paul Collier. “It’s very satisfying for our operators to be back in the driver’s seat, with what’s effectively a new accelerator to bring on-stream, carefully, step by step.”

The technical stop of the LHC was a Herculean task. Some 10,000 electrical interconnections between the magnets were consolidated.

Magnet protection systems were added, while the cryogenic, vacuum and electronics systems were improved and strengthened.

Furthermore, the beams will be set up in such a way that they will produce more collisions by bunching protons closer together, with the time separating bunches being reduced from 50 nanoseconds to 25 nanoseconds.
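As a rough back-of-the-envelope illustration (simple arithmetic, not from the press release), halving the bunch spacing doubles the maximum rate of bunch crossings in the detectors:

# Bunch-crossing rate = 1 / bunch spacing (illustrative arithmetic only)
spacing_run1 = 50e-9  # seconds between bunches in Run 1
spacing_run2 = 25e-9  # seconds between bunches in Run 2
print(1 / spacing_run1)  # 2e7  -> about 20 million crossings per second
print(1 / spacing_run2)  # 4e7  -> about 40 million crossings per second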

“After two years of effort, the LHC is in great shape,” said CERN Director for Accelerators and Technology, Frédérick Bordry. “But the most important step is still to come when we increase the energy of the beams to new record levels.”

The LHC is entering its second season of operation. Thanks to the work done in the last two years, it will operate at unprecedented energy – almost double that of season 1 – at 6.5 TeV per beam. With 13 TeV proton-proton collisions expected before summer, the LHC experiments will soon be exploring uncharted territory.

The Brout-Englert-Higgs mechanism, dark matter, antimatter and quark-gluon plasma are all on the menu for LHC season 2.

After the discovery of the Higgs boson in 2012 by the ATLAS and CMS collaborations, physicists will be putting the Standard Model of particle physics to its most stringent test yet, searching for new physics beyond this well-established theory describing particles and their interactions.



Centenary of Einstein’s theory of General Relativity

In 1915, the theory of General Relativity developed by Einstein showed how light is at the centre of the very structure of space and time.

There will be many events worldwide focusing on this seminal theory of the universe, and this page will provide specific links so you can get involved, and will also provide other resources so that you can learn about Einstein and his many contributions to physics and cosmology.

2015 marks an important milestone in the history of physics: one hundred years ago, in November 1915, Albert Einstein wrote down the famous field equations of General Relativity. General Relativity is the theory that explains all the gravitational phenomena we know (falling apples, orbiting planets, escaping galaxies…) and it has survived a century of continuous tests of its validity.

After 100 years it should by now be considered a classic textbook theory, but General Relativity remains young in spirit: its central idea, the fact that space and time are dynamical and influenced by the presence of matter, is still mind-boggling and difficult to accept as a well-tested fact of life.

 

The development of the theory was driven by experiments that took place mostly in Einstein’s brain (that is, so-called “thought experiments”).

These experiments centred on the concept of light: “What happens if light is observed by an observer in motion?” “What happens if light travels in the presence of a gravitational field?” Naturally, several tests of General Relativity have to do with light too: the first success of the theory, and the one that made it known to the whole world, was the observation of the deflection of light by the Sun.

In 1919, during an eclipse, Eddington was able to observe the effect of the Sun on the light coming from faraway stars.

The observed deflection was in perfect agreement with Einstein’s theory, while the prediction of Newton’s old theory was off by a factor of 2: a triumph for Einstein! Nowadays, light deflection by astrophysical objects (that is, optics with very massive lenses!) is a tool successfully used to explore the Universe: it is called gravitational lensing.
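To give a feel for that factor of 2 (a hedged sketch using standard textbook constants, not figures quoted in the article): General Relativity predicts a deflection of 4GM/(c²R) for a light ray grazing the Sun, while the Newtonian calculation gives half of that.

import math

# Deflection of starlight grazing the Sun: GR predicts 4GM/(c^2 R),
# the Newtonian calculation gives half that value.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30   # kg
c = 2.998e8        # m/s
R_sun = 6.96e8     # m, light ray grazing the solar limb

alpha_gr = 4 * G * M_sun / (c**2 * R_sun)   # radians
arcsec = math.degrees(alpha_gr) * 3600
print(round(arcsec, 2))      # ~1.75 arcseconds (General Relativity)
print(round(arcsec / 2, 2))  # ~0.88 arcseconds (Newtonian prediction)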

 

Light remained central in subsequent tests of the theory, for example in the so-called gravitational redshift: light changes frequency when it moves in a gravitational field, another prediction of General Relativity, experimentally tested since 1959.

Actually, the happy marriage between light and General Relativity matters every time we use a GPS device: general relativistic effects are crucial for determining our position with the required accuracy!
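To give a rough sense of the size of these effects (a back-of-the-envelope sketch using standard values, not numbers from the article): the clocks on GPS satellites run fast by roughly 45 microseconds per day because gravity is weaker at their altitude, and slow by roughly 7 microseconds per day because of their orbital speed, leaving a net drift of about 38 microseconds per day that must be corrected for.

import math

# Back-of-the-envelope GPS clock drift per day (leading-order estimates).
GM_earth = 3.986e14   # m^3/s^2
c = 2.998e8           # m/s
r_ground = 6.371e6    # m, receiver on the Earth's surface
r_orbit = 2.656e7     # m, GPS orbital radius (~20,200 km altitude)
day = 86400           # s

# Gravitational effect: the satellite clock runs fast relative to the ground.
gravitational = GM_earth / c**2 * (1 / r_ground - 1 / r_orbit) * day

# Special-relativistic time dilation from the orbital speed: the clock runs slow.
v_orbit = math.sqrt(GM_earth / r_orbit)
velocity = v_orbit**2 / (2 * c**2) * day

print(gravitational * 1e6)               # ~ +45 microseconds per day
print(-velocity * 1e6)                   # ~  -7 microseconds per day
print((gravitational - velocity) * 1e6)  # ~ +38 microseconds per day net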

 

But the most amazing prediction of General Relativity has to do not with light, but rather with its absence! Black holes are objects so dense that even light cannot escape their strong gravitational field!

Again it is not science fiction: black holes are by now standard objects that we (indirectly!) observe and study.

 

On much larger, cosmological scales, the redshift of light from galaxies and exploding stars (supernovae) constitutes the basic tool that allows us to “map” the Universe and study its “geometry”.

It is through these tools that we realized that the Universe is expanding, i.e. all galaxies are moving away from each other.

Even more recently it became clear that this expansion is in fact accelerating! As a consequence, we realized that there is a new form of (dark) energy present in our Universe! It is worth noting that all these amazing and surprising discoveries were made possible by studying the light coming from distant astrophysical events within the framework of General Relativity.

 

LHC experiments join forces to zoom in on the Higgs boson

Geneva, March 2015. During the 50th session of the “Rencontres de Moriond” conference in La Thuile, Italy, ATLAS and CMS presented for the first time a combination of their results on the mass of the Higgs boson. The combined mass of the Higgs boson is mH = 125.09 ± 0.24 (0.21 stat. ± 0.11 syst.) GeV, which corresponds to a measurement precision of better than 0.2%.
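As a quick sanity check on that figure (simple arithmetic, not part of the press release), the relative precision is the total uncertainty divided by the measured mass:

# Relative precision of the combined Higgs boson mass measurement
print(0.24 / 125.09)  # about 0.0019, i.e. roughly 0.19%, indeed better than 0.2%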

The Higgs boson is an essential ingredient of the Standard Model of particle physics, the theory that describes all known elementary particles and their interactions.

The Brout-Englert-Higgs mechanism, through which the existence of the Higgs boson was predicted, is believed to give mass to all elementary particles. This combined result is the most precise measurement of the Higgs boson mass yet and among the most precise measurements performed at the LHC to date.

 

“Collaboration is really part of our organization’s DNA”, said CERN Director General Rolf Heuer. “I’m delighted to see so many brilliant physicists from ATLAS and CMS joining forces for the very first time to obtain this important measurement at the LHC.”

The Higgs boson decays into various different particles. For this measurement, results on the two decay channels that best reveal the mass of the Higgs boson have been combined (the Higgs boson decaying to two photons and to four leptons, the leptons being electrons or muons here).

Each experiment has found a few hundred events in the Higgs to photons channel and a few tens in the Higgs to leptons channel, using the data collected at the LHC in 2011 and 2012 at centre-of-mass energies of 7 and 8 TeV, having examined about 4000 trillion proton-proton collisions.

The two collaborations worked together and reviewed the analyses and their combination. Experts of the analyses and of the different parts of the detectors that play a major role in this measurement were closely involved.

“The Higgs boson was discovered at the LHC in 2012 and the study of its properties has just begun. By sharing efforts between ATLAS and CMS, we are going to understand this fascinating particle in more detail and study its behaviour,” said CMS spokesperson Tiziano Camporesi.

“CMS and ATLAS use different detector technologies and different detailed analyses to determine the Higgs mass. The measurements made by the experiments are quite consistent, and we have learnt a lot by working together, which stands us in good stead for further combinations,” said ATLAS spokesperson Dave Charlton.

The Standard Model does not predict the mass of the Higgs boson itself and therefore it must be measured experimentally.

However, once supplied with a Higgs mass, the Standard Model does make predictions for all the other properties of the Higgs boson, which can then be tested by the experiments.

This mass combination represents the first step towards a combination of other measurements of Higgs boson properties, which will involve also the other decays.

“While we are just getting ready to restart the LHC, it is admirable to notice the precision already achieved by the two experiments and the compatibility of their results. This is very promising for LHC Run 2”, said CERN Director of Research Sergio Bertolucci.

This result was achieved by bringing together physicists of the ATLAS and CMS collaborations, representing together more than 5,000 scientists from over 50 different countries.

Up to now, increasingly precise measurements from the two experiments have established that all observed properties of the Higgs boson, including its spin, parity and interactions with other particles, are consistent with the Standard Model Higgs boson.

With the upcoming combination of other Run 1 Higgs results from the two experiments, and with higher energy and more collisions to come during LHC Run 2, physicists expect to further increase the precision of the Higgs boson mass measurement and to explore the particle’s properties in more detail.

During Run 2, they will be able to combine their results promptly and thus increase the LHC’s sensitivity to effects that could hint at new physics beyond the Standard Model.

Physicist explains black hole that should not exist

In his weekly blog, Science Seen, the Australian-Canadian physicist Colin Gillespie, author of Time One: Discover How the Universe Began, says a newly discovered monster black hole can be explained even though current theories of black-hole formation cannot account for its existence.

The problem is that the newfound black hole was already more massive than ten billion Suns when the first stars and galaxies had just come into being. Gillespie says that, contrary to conventional theory, big black holes predate the stars and galaxies. He says a new kind of cosmology makes this possible.

In Gillespie’s own words: “Current theories portray an analog universe. That is, their space and time are said to be continuous. My recent book Time One explores a quantum universe (a concept that goes back at least as far as Albert Einstein), one that begins with a single Planck-sized quantum (or fleck) of space that starts to replicate with the first tock of a Planck-time-quantum clock. This quantum origin helps to explain many observations that seemed inexplicable.”

Analog theories of space and time fail to explain how such a huge black hole could grow in time for us to see it 12.8 billion years ago. But Gillespie says its presence at that time is easily explained in terms of Time One’s quantum cosmology. You would expect to see big black holes right after the beginning of a quantum universe.

 

Note: is this a new discovery, or a new business opportunity to sell the book? The answer lies in your own reading…

Dr Fabiola Gianotti, new CERN Director-General

CERN Council selected the Italian physicist, Dr Fabiola Gianotti, as the Organization’s next Director-General. The appointment will be formalised at the December session of Council, and Dr Gianotti’s mandate will begin on 1 January 2016 and run for a period of five years. Council rapidly converged in favour of Dr Gianotti.

“We were extremely impressed with all three candidates put forward by the search committee,” said President of Council Agnieszka Zalewska. “It was Dr Gianotti’s vision for CERN’s future as a world leading accelerator laboratory, coupled with her in-depth knowledge of both CERN and the field of experimental particle physics that led us to this outcome. I would like to thank all the candidates for giving Council such a hard decision to make, and the search committee for all its hard work over recent months.”

 

“Fabiola Gianotti is an excellent choice to be my successor,” said CERN Director General Rolf Heuer. “It has been a pleasure to work with her for many years. I look forward to continuing to work with her through the transition year of 2015, and am confident that CERN will be in very good hands.”

“It is a great honour and responsibility for me to be selected as the next CERN Director-General following 15 outstanding predecessors,” said Dr Gianotti. “CERN is a centre of scientific excellence, and a source of pride and inspiration for physicists from all over the world. CERN is also a cradle for technology and innovation, a fount of knowledge and education, and a shining, concrete example of worldwide scientific cooperation and peace. It is the combination of these four assets that renders CERN so unique, a place that makes better scientists and better people. I will fully engage myself to maintain CERN’s excellence in all its attributes, with the help of everybody, including CERN Council, staff and users from all over the world.”

 

Dr Gianotti was leader of the ATLAS experiment collaboration from March 2009 to February 2013, covering the period in which the LHC experiments ATLAS and CMS announced the long-awaited discovery of the so-called Higgs boson, recognised by the award of the Nobel Prize to François Englert and Peter Higgs in 2013. She is a member of many international committees, and has received many prestigious awards. She will be the first woman to hold the position of CERN Director-General.

A press conference will be held at CERN’s Globe of Science and Innovation this afternoon at 15:00, at which Dr Gianotti will be joined by the President of CERN Council and the CERN Director-General.

Rosetta rendezvous with a comet

At the Royal Society’s Summer Science Exhibition in London, there is a very important exhibit about Rosetta.

The ESA Rosetta mission is the first mission in history to rendezvous with a comet, escort it as it orbits the Sun, and deploy a lander. During its 10-year journey towards comet 67P/Churyumov-Gerasimenko, the spacecraft has passed by two asteroids: 2867 Steins (in 2008) and 21 Lutetia (in 2010).
In order for the Rosetta craft to reach the comet, it was launched from Earth over 10 years ago; its flight path through the Solar System was carefully modelled and planned before launch. The spacecraft entered deep-space hibernation mode in June 2011 and ‘woke up’ on 20 January 2014. Rosetta will arrive at the comet in August 2014 and deploy the Philae lander in November 2014.

ESA’s Rosetta spacecraft has found that comet 67P/Churyumov–Gerasimenko is releasing the equivalent of two small glasses of water into space every second, even at a cold 583 million kilometres from the Sun.
To escape from the gravity of an object, such as a planet or a moon, you must reach a certain speed to overcome the gravity pulling you back. This speed is known as the escape velocity. The escape velocity of an object depends upon its mass and its size. If we assume the object is spherical, the escape velocity (Ve) is given by the equation:

Ve = √(2GM/r)

where G is the gravitational constant, r is the radius of the object in metres and M is its mass in kilograms.

As a curiosity: just as astronauts are able to jump much higher when they are on the Moon compared with when they are on Earth, so too the escape velocity from the Moon is much lower than that of the Earth. The escape velocity from the Earth is about 11.2 km/s (11,200 m/s), while for the Moon it is about 2.4 km/s (2,400 m/s). In both cases the speed is very high, so it is not at all easy to reach escape velocity. These concepts are at the basis of every space mission and were also applied to Rosetta.
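As a rough illustration of the formula above (a minimal sketch using approximate textbook values for the masses and radii, not figures from the exhibition):

import math

# Escape velocity Ve = sqrt(2GM/r), with approximate textbook values.
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def escape_velocity(mass_kg, radius_m):
    return math.sqrt(2 * G * mass_kg / radius_m)

print(escape_velocity(5.972e24, 6.371e6))  # Earth: ~11,200 m/s
print(escape_velocity(7.342e22, 1.737e6))  # Moon:  ~2,400 m/s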

A rendezvous with a comet, and a landing on it, to study it and to understand where we come from: it sounds like a fantasy, or like reading a page of The Little Prince, but it is not; it is reality.
In November, Rosetta will deploy its Philae lander on comet 67P/Churyumov-Gerasimenko, and we will have new evidence and exciting new facts about our lives and our universe.

‘The home of human ingenuity’

Science Museum, London: our point of view

The first sentence that visitors come across when entering the Science Museum is: “Welcome to the home of human ingenuity”, surrounded by pictures representing the latest scientific discoveries. From the very beginning the building appears to be very modern and undoubtedly huge.

The latest discoveries about space and the atmosphere (along with spacecraft models), agriculture, culture, the cosmos, mathematics, physics, computing and energy are only some of the subjects showcased in the museum. For fans of aviation there is an entire floor dedicated to flight, with original planes that can be admired and compared. Not least, there are two 360° flight simulators, which seem an unusual, though welcome, novelty for a museum. At every turn there is an opportunity for visitors, especially young people, to use practical scientific instruments and test their abilities on a variety of subjects.

Established in 1857, the Science Museum is now one of the most esteemed museums in London and the UK.
“From 3 to 4 thousand people come daily to visit the museum during the week, while on Sundays and holidays this number grows to up to 10 thousand,” says a member of the museum’s staff. What is certain is that the Science Museum is the 5th most visited museum in the UK, with roughly 3 million visitors every year, according to ALVA statistics.
“Not everyone comes to the museum knowing what they are going to encounter, whereas others deliberately come to see key things at the museum, such as the DNA model, the ‘Babbage’ (the first mechanical computer), and the means of transport, which are well preserved in the museum,” concluded another member of staff.
To sum up, the museum entirely fulfilled our expectations, as it clearly showcased the real and practical results of human ingenuity.

Brain and electricity: find the connection!

Science Museum, London – Have you ever thought that movement is caused by an electrical pulse? If you visit the “Mind Maps: Stories from Psychology” exhibition at the Science Museum before 26 October, you will satisfy your curiosity about this field. Here, in fact, you can find many tools and exhibits that explain the experiments and their history.

Let’s start with the first doctor to consider the power of electricity on the human body: Luigi Galvani (1737-98), one of the most important Italian physicians and philosophers. He was convinced that the nerves are capable of controlling every movement, and his suspicion was confirmed during the dissection of a frog in 1790. Some decades later, in 1860, the German physician Emil du Bois-Reymond invented a ‘frog pistol’ in order to demonstrate Galvani’s discoveries concretely.

Last but not least, one experiment consists of a combination of voices and objects. This treatment was used with patients who heard strange voices in their minds and, from their choice of a particular object, the doctor was able to understand the nature of the trauma.