2018 Marie Curie Individual Fellowship (H2020-MSCA-IF-2018)

p3dr0
Posts: 3
Joined: Tue Feb 12, 2019 10:49 am

Re: 2018 Marie Curie Individual Fellowship (H2020-MSCA-IF-2018)

Post by p3dr0 » Wed Feb 13, 2019 12:56 pm

CountZ wrote:
Wed Feb 13, 2019 10:49 am
Hi again everyone

We are writing a complaint to the Commission about unfair evaluation processes, but we need your help.
If you think that the evaluation is unfair, please tell us why. Maybe you got a much lower score on re-submission. Maybe you don't think your proposal should have received the high/low score it did. The goal is to get the Commission to rethink its evaluation process, so as to make this less of a lottery, and instead to reward the best proposals.
My two cents on this.

You cannot fight unfair (or perceived-as-unfair, which is an important nuance) reviews when review time is limited and there are multiple reviewers. If you believe the evaluation was unfair, then, as stated by the EC, "you may request an ‘evaluation review’ on the procedural aspects of the evaluation (not the merits of your proposal)". I did this two years ago for what I considered an unfair review, and it did not change anything. Unfair, yes, but that's the name of the game.

What is quite different, in my opinion, is getting a much lower score on re-submission. That suggests the whole evaluation process is flawed and that the whole thing is a lottery (in fact, some authors have suggested using a partial lottery system; see e.g. Gross & Bergstrom 2019, PLOS Biology).

If you want to follow the op-ed route, the second phenomenon seems more problematic to me (differences of more than 20% in the final mark from the same starting material?). The first one is trivial: of course you may get conflicting reviews depending on the level of expertise of the reviewers within the panel. Still, if you want to discuss this "unfairness", is the number of evaluation reviews requested per year available? That could be a good indicator. Another thing to keep in mind: for every unfair review, how many fair ones are there? And what ratio is acceptable?
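Incidentally, for anyone wondering what a "partial lottery" would actually look like, here is a rough sketch in Python. This is my own illustration of the general idea, not the exact model from the Gross & Bergstrom paper, and the threshold and number of grants are made-up numbers:

import random

def partial_lottery(scores, n_grants, threshold=85.0, seed=None):
    """Illustrative partial lottery: every proposal scoring at or above
    `threshold` enters a random draw for the available grants, instead of
    being ranked down to the first decimal. The threshold and grant count
    here are hypothetical, not anything the EC actually uses."""
    rng = random.Random(seed)
    eligible = [p for p, s in scores.items() if s >= threshold]
    if len(eligible) <= n_grants:
        return eligible                    # enough budget: fund every eligible proposal
    return rng.sample(eligible, n_grants)  # otherwise, draw the winners at random

# Made-up example: three proposals clear the threshold, two grants available.
print(partial_lottery({"A": 92.4, "B": 89.6, "C": 86.0, "D": 71.2}, n_grants=2, seed=1))

The point is simply that above the quality threshold the outcome is explicitly random, rather than pretending a 0.2-point difference between two strong proposals is meaningful.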

AdinaBabesh
Posts: 165
Joined: Wed Jan 23, 2019 4:24 pm

Re: 2018 Marie Curie Individual Fellowship (H2020-MSCA-IF-2018)

Post by AdinaBabesh » Wed Feb 13, 2019 1:10 pm

I also want to contribute my proposal; I will send a private message soon with the requested information.
Giu83 wrote:
Wed Feb 13, 2019 11:58 am
CountZ wrote:
Wed Feb 13, 2019 10:49 am
Hi again everyone

We are writing a complaint to the Commission about unfair evaluation processes, but we need your help.
If you think that the evaluation is unfair, please tell us why. Maybe you got a much lower score on re-submission. Maybe you don't think your proposal should have received the high/low score it did. The goal is to get the Commission to rethink its evaluation process, so as to make this less of a lottery, and instead to reward the best proposals. You can quote this message and fill in the following:

Proposal acronym:
Proposal ID:
Your field/area:
Reason you believe the system is unfair:
Suggestion for improving it:

Please also Private Message your email address to me or to Dajm, if you wish to: (1) be recontacted about our complaint, (2) have an opportunity to sign the complaint before we send it.

As I said in a previous message, my proposal is a resubmission that scored 89.6 in 2017. After revising my project according to the weaknesses raised, it received a score of 71.2. None of the previous weaknesses was raised again by the reviewers (which means I addressed all of the criticisms), but, although the project was essentially the same, it was rejected due to a number of serious inherent weaknesses. In the last paragraph of the evaluation report the reviewers state that "Over the years proposals are usually assessed by different evaluators who may express different judgements and opinions... This may lead to a difference in scoring results and opinions". Of course this makes sense, but not such a huge drop as in my case (from 89.6 to 71.2 is a drop of 18.4 points, roughly 20% less than the year before). Furthermore, I also found some inconsistencies between strengths and weaknesses, for example:

Excellence

Strengths:
- The researcher will benefit from the host institute’s experience in MRI techniques and research in psychiatric disorders, as well as from the host institute's international network, and will gain new skills in supervising.
- The researcher has a good track record of successful research, publications, teaching activities and previously acquired skills and awards, demonstrating their potential to reach maturity.

Weaknesses:
- It is not sufficiently detailed how the new skills build on existing skills and how they contribute to the researcher's professional maturity.



Impact

Strengths:
- The fellowship will enhance the scientific, clinical and technical scope of the researcher, thus widening the potential career opportunities.
- There is an adequate strategy and plan to disseminate to the scientific community, with publications in peer-reviewed journals, presentations at conferences and scientific press releases.
- There is a strategy for commercialisation of the results.

Weaknesses:
- It is not sufficiently clear how the project will enhance the career prospects of the researcher.
- Although the strong track record of technology transfer and IP of the host is outlined, the scope of the current project is limited in terms of potential for technology transfer and IP applications.
- Dissemination to stakeholders like pharma companies, regulatory bodies, and policy makers is insufficiently addressed.

JimmyQuan
Posts: 6
Joined: Wed Jun 20, 2018 10:24 am

Re: 2018 Marie Curie Individual Fellowship (H2020-MSCA-IF-2018)

Post by JimmyQuan » Wed Feb 13, 2019 1:16 pm

Hi guys, I got 90.6% last year for my first submission and was on the reserve list. I am lucky that my score went up to 96.4% this time. I know it is painful when the score drops significantly on a second or third submission, and I am sorry for those going through that. I have learned a lot from this forum and would like to share some of my opinions. Hopefully this will help someone.

My suggestion is: DO NOT amend your proposal only, or too much, to fix the weaknesses mentioned in the ESR. That only makes the previous reviewer happy, and, more importantly, they are not going to be the one reviewing your proposal this year. You may feel that this year's reviewer doesn't appreciate your hard work and high hopes at all, but maybe the previous reviewer simply didn't point out or notice many of your proposal's weaknesses either. This is why, even though I got 90.6% last time, I still changed a great deal of the proposal. As everybody probably knows, pay close attention to the details in the proposal instructions.

AdinaBabesh
Posts: 165
Joined: Wed Jan 23, 2019 4:24 pm

Re: 2018 Marie Curie Individual Fellowship (H2020-MSCA-IF-2018)

Post by AdinaBabesh » Wed Feb 13, 2019 1:26 pm

Thank you very much for your input. I don't really know if adding more detail makes much of a difference. I included a lot of details: how many students I will supervise at the MA level, how many posts I will put on social media, when I will do that and why, which conferences I will attend and when, in which journals I will publish; I even included the working titles of my papers. And it didn't help at all. My score is still 75. One reviewer penalized me for having too many deliverables (11): 2 papers, 2 conferences, 4 social media posts, 2 press articles and 1 Marie Curie ambassador day in my own country. I also described what all of these deliverables would look like.
JimmyQuan wrote:
Wed Feb 13, 2019 1:16 pm
Hi guys, I got 90.6% last year for my first submission and was on the reserve list. I am lucky that my score went up to 96.4% this time. I know it is painful when the score drops significantly on a second or third submission, and I am sorry for those going through that. I have learned a lot from this forum and would like to share some of my opinions. Hopefully this will help someone.

My suggestion is: DO NOT amend your proposal only, or too much, to fix the weaknesses mentioned in the ESR. That only makes the previous reviewer happy, and, more importantly, they are not going to be the one reviewing your proposal this year. You may feel that this year's reviewer doesn't appreciate your hard work and high hopes at all, but maybe the previous reviewer simply didn't point out or notice many of your proposal's weaknesses either. This is why, even though I got 90.6% last time, I still changed a great deal of the proposal. As everybody probably knows, pay close attention to the details in the proposal instructions.

Orion
Posts: 1
Joined: Wed Feb 13, 2019 1:17 pm

Re: 2018 Marie Curie Individual Fellowship (H2020-MSCA-IF-2018)

Post by Orion » Wed Feb 13, 2019 1:50 pm

I think that each year the context is different, the context being all of the proposals submitted together and a different set of reviewers.

If you are attempting a re-submission, it is very important to read and interpret the reviewers' comments (and try to improve the proposal in every respect). Obviously, the ranking of proposals is not derived only from the scores of two reviewers. The final score of a proposal is probably a mixture of the reviewers' scores, modulated by the "commission meeting" that decides which proposals should be funded (according to the topics in each area). Several proposals are probably addressing the same topic with only minimal differences.

Some strange comments probably appear after that commission meeting. I think it is important to read the comments carefully and try to work out whether your proposal made it to the meeting and, in the end, was simply not funded.

CountZ
Site Admin
Posts: 209
Joined: Thu Jun 01, 2017 7:14 pm

Re: 2018 Marie Curie Individual Fellowship (H2020-MSCA-IF-2018)

Post by CountZ » Wed Feb 13, 2019 2:49 pm

Thanks for this!

We were considering the article on the lottery system; I still have to read it. My opinion (without having read it yet) is that such a system would fail to differentiate between very good proposals (which should get funded) and merely good proposals (which shouldn't be funded, given limited resources).

p3dr0 wrote:
Wed Feb 13, 2019 12:56 pm
My two cents on this.

You cannot fight unfair (or perceived-as-unfair, which is an important nuance) reviews when review time is limited and there are multiple reviewers. If you believe the evaluation was unfair, then, as stated by the EC, "you may request an ‘evaluation review’ on the procedural aspects of the evaluation (not the merits of your proposal)". I did this two years ago for what I considered an unfair review, and it did not change anything. Unfair, yes, but that's the name of the game.

What is quite different, in my opinion, is getting a much lower score on re-submission. That suggests the whole evaluation process is flawed and that the whole thing is a lottery (in fact, some authors have suggested using a partial lottery system; see e.g. Gross & Bergstrom 2019, PLOS Biology).

If you want to follow the op-ed route, the second phenomenon seems more problematic to me (differences of more than 20% in the final mark from the same starting material?). The first one is trivial: of course you may get conflicting reviews depending on the level of expertise of the reviewers within the panel. Still, if you want to discuss this "unfairness", is the number of evaluation reviews requested per year available? That could be a good indicator. Another thing to keep in mind: for every unfair review, how many fair ones are there? And what ratio is acceptable?
p3dr0 wrote:
Wed Feb 13, 2019 12:56 pm
Still, if you want to discuss this "unfairness", is the number of evaluation reviews requested per year available? That could be a good indicator.
What do you mean by the number of evaluation reviews per year?

IF ST LIF
Posts: 132
Joined: Tue Jan 16, 2018 7:10 pm

Re: 2018 Marie Curie Individual Fellowship (H2020-MSCA-IF-2018)

Post by IF ST LIF » Wed Feb 13, 2019 3:26 pm

p3dr0 wrote:
Wed Feb 13, 2019 12:56 pm
CountZ wrote:
Wed Feb 13, 2019 10:49 am
Hi again everyone

We are writing a complaint to the Commission about unfair evaluation processes, but we need your help.
If you think that the evaluation is unfair, please tell us why. Maybe you got a much lower score on re-submission. Maybe you don't think your proposal should have received the high/low score it did. The goal is to get the Commission to rethink its evaluation process, so as to make this less of a lottery, and instead to reward the best proposals.
My two cents on this.

You cannot fight unfair (or perceived-as-unfair, which is an important nuance) reviews when review time is limited and there are multiple reviewers. If you believe the evaluation was unfair, then, as stated by the EC, "you may request an ‘evaluation review’ on the procedural aspects of the evaluation (not the merits of your proposal)". I did this two years ago for what I considered an unfair review, and it did not change anything. Unfair, yes, but that's the name of the game.

What is quite different, in my opinion, is getting a much lower score on re-submission. That suggests the whole evaluation process is flawed and that the whole thing is a lottery (in fact, some authors have suggested using a partial lottery system; see e.g. Gross & Bergstrom 2019, PLOS Biology).

If you want to follow the op-ed route, the second phenomenon seems more problematic to me (differences of more than 20% in the final mark from the same starting material?). The first one is trivial: of course you may get conflicting reviews depending on the level of expertise of the reviewers within the panel. Still, if you want to discuss this "unfairness", is the number of evaluation reviews requested per year available? That could be a good indicator. Another thing to keep in mind: for every unfair review, how many fair ones are there? And what ratio is acceptable?
Totally agree. This is a crystal-clear indicator of the weaknesses of the system. I was discussing this just this morning with the OIP office of my host, and they told me that almost every year they have at least one case with a big drop after a resubmission. This is a real issue with the system: a lack of consistency. It is a total flaw that the strengths of one year turn into weaknesses based on the subjective view of an "expert" (I would like to see the names of the evaluators) who spends only a few minutes per application.

AdinaBabesh
Posts: 165
Joined: Wed Jan 23, 2019 4:24 pm

Re: 2018 Marie Curie Individual Fellowship (H2020-MSCA-IF-2018)

Post by AdinaBabesh » Wed Feb 13, 2019 3:39 pm

The evaluation letter says this:

"You may request an evaluation review on the procedural aspects of the evaluation (not the merits of your proposal). This request must be submitted by the coordinator (via the following link: https://webgate.ec.europa.eu/redress-fr ... work.iface) — within 30 days after receiving this letter."

AdinaBabesh
Posts: 165
Joined: Wed Jan 23, 2019 4:24 pm

Re: 2018 Marie Curie Individual Fellowship (H2020-MSCA-IF-2018)

Post by AdinaBabesh » Wed Feb 13, 2019 3:57 pm

Information on the means of redress

You may request an evaluation review on the procedural aspects of the evaluation (not the merits of your proposal). This request must be submitted by the coordinator (via the following link: https://webgate.ec.europa.eu/redress-fr ... work.iface) — within 30 days after receiving this letter.

You may request a legal review of the procedural aspects of the evaluation (not the merits of your proposal) under Article 22 of Council Regulation No 58/2003 (‘Article 22 request’) — within 1 month of receiving this letter (via the following link: EAC-FOR-APPEALS-UNDER-ART-22-OF-REG-58-2003@ec.europa.eu).

You may bring an action for annulment under Article 263 of the Treaty on the Functioning of the European Union (‘Article 263 action’) against the Agency — within 2 months of receiving this letter.

Please be aware that you cannot take more than one formal action at a time. Thus, if you make, for instance, a request for evaluation review, you cannot — at the same time — take any other action (e.g. also file an Article 22 request or an Article 263 TFEU action). If you file an Article 22 request, you cannot — at the same time — bring an Article 263 action.

You must wait for the final decision of the Commission/Agency and can then take further action against that decision. All deadlines will start to run from when you receive the final decision.

AdinaBabesh wrote:
Wed Feb 13, 2019 3:39 pm
The evaluation letter says this:

"You may request an evaluation review on the procedural aspects of the evaluation (not the merits of your proposal). This request must be submitted by the coordinator (via the following link: https://webgate.ec.europa.eu/redress-fr ... work.iface) — within 30 days after receiving this letter."

Geezer_MSCA
Posts: 27
Joined: Sat Feb 09, 2019 3:48 pm

Re: 2018 Marie Curie Individual Fellowship (H2020-MSCA-IF-2018)

Post by Geezer_MSCA » Wed Feb 13, 2019 4:07 pm

IF ST LIF wrote:
Wed Feb 13, 2019 3:26 pm
p3dr0 wrote:
Wed Feb 13, 2019 12:56 pm
CountZ wrote:
Wed Feb 13, 2019 10:49 am
Hi again everyone

We are writing a complaint to the Commission about unfair evaluation processes, but we need your help.
If you think that the evaluation is unfair, please tell us why. Maybe you got a much lower score on re-submission. Maybe you don't think your proposal should have received the high/low score it did. The goal is to get the Commission to rethink its evaluation process, so as to make this less of a lottery, and instead to reward the best proposals.
My two cents on this.

You cannot fight unfair (or perceived-as-unfair, which is an important nuance) reviews when review time is limited and there are multiple reviewers. If you believe the evaluation was unfair, then, as stated by the EC, "you may request an ‘evaluation review’ on the procedural aspects of the evaluation (not the merits of your proposal)". I did this two years ago for what I considered an unfair review, and it did not change anything. Unfair, yes, but that's the name of the game.

What is quite different, in my opinion, is getting a much lower score on re-submission. That suggests the whole evaluation process is flawed and that the whole thing is a lottery (in fact, some authors have suggested using a partial lottery system; see e.g. Gross & Bergstrom 2019, PLOS Biology).

If you want to follow the op-ed route, the second phenomenon seems more problematic to me (differences of more than 20% in the final mark from the same starting material?). The first one is trivial: of course you may get conflicting reviews depending on the level of expertise of the reviewers within the panel. Still, if you want to discuss this "unfairness", is the number of evaluation reviews requested per year available? That could be a good indicator. Another thing to keep in mind: for every unfair review, how many fair ones are there? And what ratio is acceptable?
Totally agree. This is a crystal-clear indicator of the weaknesses of the system. I was discussing this just this morning with the OIP office of my host, and they told me that almost every year they have at least one case with a big drop after a resubmission. This is a real issue with the system: a lack of consistency. It is a total flaw that the strengths of one year turn into weaknesses based on the subjective view of an "expert" (I would like to see the names of the evaluators) who spends only a few minutes per application.
For you, and for anyone else who wants an idea of who the experts are, search Google for the following:

h2020-expertslists-excellent-msca-2016_en

The first result should be an Excel file (of the same name) with the list of all the evaluators for the 2016 MSCA-IF.

Locked