Digital Identifiers Improve Recognition and Credit: ORCID


Connectivity and integrated data are moving ahead as the best minds in metadata, digital identifiers and networked discovery collaborate to ensure that appropriate credit and recognition are given for scientific outputs of all types.


Some notes on "Citations for Sale" about King Abdulaziz University offering me $$ to become adjunct faculty

There is a news story in the Daily Cal titled "Citations for Sale" by Megan Messerly about King Abdulaziz University trying to pay highly cited researchers to become adjunct professors there in order to boost its rankings.  This article stemmed from a blog post by Lior Pachter.  I was interviewed by the Daily Cal reporter about this because I had sent Lior some of the communications I had had with people from KAU who tried to get me to do this.

I am posting here some of the email discussions / threads that I shared with Lior and Megan.

Thread #1.

Here is one thread of emails in which KAU tried to get me to become an Adjunct Professor.  I have pulled out the text of the emails and removed the sender's identity just in case this would get him in trouble.

Received this email 3/6/14
Dear Prof Jonathan, 
How are you? I hope every thing is going well. I hope to share work with you and initiate a collaboration between you and me in biology department, Kind Abdulaziz university . My research focuses on (redacted detail). I hope that you would agree . 
Redacted Name,Ph.D
Assistant Professor, Faculty of Sciences, King Abdulaziz University, Jeddah, KSA.
My response:
What kind of collaboration are you imagining?
His response:
Hi Prof Jonathan, 
Let me to explain that the king abdulaziz university initiated a project which is called Highly Cited Professor ( HiCi). This project can provide a contract between you and our university and from this contract you can get 7000 US$ as a monthly salary .So  this project will allow you to generate two research proposal between you and a research stuff here in order to make an excellent publications in high quality journals as you always do. 
I hope that I was clear. I' m looking forward to hear from you. Finally, I think that a very good chance to learn from you. 
 Another email from him:
Dear prof Jonathan, 
I' d like to tell that Prof Inder Verma will come tomorrow to our university  as a highly cited professor and he also signed a contract with us. At March 28 Prof Paul Hebert will come to our university and we actually generated two projects with Prof Paul. I hope you trust me and you call Prof  Inder and  Paul to be sure. 
From me:
I trust you - just am very busy so not sure if this is right for me
Sent from my iPhone
From him:
You will come to our university for just two visits annually and each visit will take only  one week. Take you time to think. Bye
Another email from him
Seat Dr Jonathan, 
What is your decision?

My response:
You have not really provided me with enough information about this.
From him:
Well, you will sign a contract as a highly cited professor between you and KAU. if it happen you will get 7,000 US$ per month for one year as a salary.  From  this project you would be able to generate two proposal with around 200,000 US$ and you will get incentives from each one. In the further we can initiate a mega project with 1.5 million US$.   Is that clear? 
From me:
I could use a formal , legal description of the agreement that one is expected to sign 
From him:
You can ask Prof Dr. Inder Verma he is now in my department and he did two presentation today. Also you can ask my professor prof  Paul Hebert, biodiversity institute of Ontario who will come to my department in March 28,2014.
From him:
if you would agree . Coul you please provide me with your CV with list of publication? 
From him:
Are you agree or no?
From me:
No 
You have not provided me with anywhere near enough info to evaluate this 
Do you have any legal agreement I can look at?
From him:
Agreement from KAU
without providing me with your CV I could not be able to talk to university administration. I told you before ask under verma or Paul Hebert both of them have contract. Dr verma " editor in chief of PNAS who is left KAS since 4 hours ago. Finally, its up to.
From me:
No thanks
Not interested from what you have told me

Thread #2

Received this email on 12/17/13

Dr. Mansour Almazroui
12/17/13
to jaeisen
Dear Prof. Jonathan Eisen ,

I am Dr. Mansour Almazroui, Highly Cited Program Manager, at King Abdulaziz University (KAU), Jeddah, Saudi Arabia. On behalf of KAU with great pleasure, I would like to invite you to join our innovative collaboration program that is called “International Affiliation program”.

KAU is considered as the largest university in the region serving more than 150,000 students, with around 4,000 faculty members and 30 colleges. For more information please locate us at:  http://www.kau.edu.sa.

The envisaged program aims to elevate our local research activities in various fields. We only extend our invitation to highly ranked researchers like you, with a solid track record in research and publications to work with KAU professors.

Joining our program will immediately put you on an annual contract, as a Distinguished Adjunct Professor. In this regard, you will only be required to work at KAU premises for three weeks in each year of your contract.

We hope you to accept our invitation and looking forward to welcome you.  Please don’t hesitate to contact me for any further query or clarification.

Sincerely,
Mansour
--
--------------------------------------------------------------------------------------
Dr. Mansour Almazroui
Highly Cited Program Manager,
Office of the Vice President for Graduated Studies and Research,
King Abdulaziz University (KAU).
&
Director, Center of Excellence for Climate Change Research
King Abdulaziz University
P. O. Box 80234, Jeddah 21589,
Saudi Arabia
I wrote back

I am intrigued but need more information about the three weeks of time at KAU and the details on the contract. 
Jonathan Eisen  
Sent from my iPhone
Got this back

Dear Prof. Jonathan Eisen , 
Hope this email finds you in good health. Thank you for your interest. Please find below the information you requested to be a “Distinguished Adjunct Professor” at KAU. 
1. Joining our program will put you on an annual contract initially for one year but further renewable. However, either party can terminate its association with one month prior notice.
2. The Salary per month is $ 6000 for the period of contract.
3. You will be required to work at KAU premises for three weeks in each contract year. For this you will be accorded with expected three visits to KAU.
4. Each visit will be at least for one week long but extendable as suited for research needs.
5. Air tickets entitlement will be in Business-class and stay in Jeddah will be in a five star hotel. The KAU will cover all travel and living expenses of your visits.
6. You have to collaborate with KAU local researchers to work on KAU funded (up to $100,000.00) projects.
7. It is highly recommended to work with KAU researchers to submit an external funded project by different agencies in Saudi Arabia.
8. May submit an international patent.
9. It is expected to publish some papers in ISI journals with KAU affiliation.
10. You will be required to amend your ISI highly cited affiliation details at the ISI highlycited.com web site to include your employment and affiliation with KAU.
Kindly let me know your acceptance so that the official contract may be preceded.
Sincerely,
Mansour
I promptly forwarded this to my brother with a note:
One way to make some extra money ... Sell your reputation / ISI index  
Sent from my iPhone
And my brother eventually shared this with Lior  ...

UPDATE 1: 12/5/2014

One key question is - what are the rules, guidelines, and ethics of listing affiliations on papers?  Here are some tidbits on this:

From Nature Communications:
The primary affiliation for each author should be the institution where the majority of their work was done.
From Taylor and Francis
The affiliations of all named co-authors should be the affiliation where the research was conducted.
From SAGE
Present the authors' affiliation addresses (where the actual work was done) below the names.

UPDATE 2: Some other posts of relevance

UPDATE 3: A Storify

Suggestion of the week: Create Project Specific Pages on ImpactStory #AltMetrics

So - I have been doing a little "hacking" of the Impact Story system to create pages specific to individual projects rather than to me or other researchers.  I did this last week for my microBEnet project: Made a project page (hack?) for microBEnet on ImpactStory.  And I have been playing around with the concept some more.

For example, see this page I made for the "iSEEM2: Environmental Niche Atlas" project that is a collaboration between my lab and the lab of Katie Pollard at UCSF (supported by the Gordon and Betty Moore Foundation).  To do this, I registered a new account in ImpactStory (with the first name i and last name SEEM2, using an alternative email address I have). I then used the "upload individual products" option and loaded up Pubmed IDs, DOIs, Github web addresses, Slideshare web addresses and more.  And voila, I get a nice page with Altmetrics for our project rather than for myself.

Now I have not loaded everything done on this project yet, but already this is a helpful way to post results from our project and look at some of their metrics. I also updated the website for the project: http://iseem2.wordpress.com.

I think making such project specific pages will end up being useful in many ways. I discovered one this AM in an email I got from Impact Story.  I have appended it below.  Turns out they give weekly updates on how your metrics have changed for that week.  This is the best thing I have seen regarding "Alt Metrics" anywhere.  Very very useful.  Still not sure if this is an "acceptable" use of ImpactStory but I figure they should be OK with it.


Your new research impacts this week

i SEEM2 (impactstory.org/iSEEM2)

20+ profile SlideShare downloads

on https://impactstory.org/iSEEM2

One or more of the 31 products on your profile attracted a combined 8 new SlideShare downloads this week, bringing you to 22 total.
Congrats on passing the 20 mark!

Welcome to the SlideShare favorites club!

on https://impactstory.org/iSEEM2

Congratulations, you just got your first SlideShare favorites!
That brings this video up to 232 YouTube views total.
It marks your 1st product to get this many views on YouTube. Nice work!
That brings this video up to 221 YouTube views total.
It marks your 2nd product to get this many views on YouTube. Nice work!
This article attracted 4 new Scopus citations this week, bringing it up to 40 total.
It marks your 1st product to get this many citations on Scopus. Nice work!
That brings this article up to 29 Scopus citations total.
Impressive! Only 1% of 2012 article have reached that many citations.
It marks your 2nd product to get this many citations on Scopus. Nice work!
This slides attracted 83 new SlideShare views this week, bringing it up to 83 total.
It marks your 3rd product to get this many views on SlideShare. Nice work!

First Delicious bookmarks

on Systematic identification of gene families for use as "markers" for phylogenetic and phylogeny-driven ecological studies of bacteria and archaea and their major subgroups.

This article attracted 1 new Delicious bookmarks this week, bringing it up to 1 total.
It marks your 4th unique product to get a bookmarks on Delicious. Nice work!

First SlideShare downloads

on Phylogeny-Driven Approaches to Genomics and Metagenomics - talk by Jonathan Eisen at Fresno State May 6, 2013

This slides attracted 7 new SlideShare downloads this week, bringing it up to 7 total.
It marks your 3rd unique product to get a downloads on SlideShare. Nice work!

First SlideShare favorites

on Phylogeny-Driven Approaches to Genomics and Metagenomics - talk by Jonathan Eisen at Fresno State May 6, 2013

This slides attracted 2 new SlideShare favorites this week, bringing it up to 2 total.
It marks your 1st unique product to get a favorites on SlideShare. Nice work!

How Open Are You? Part 1: Metrics to Measure Openness and Free Availability of Publications

For many, many years I have been raising a key question in relation to open access publishing: how can we measure how open someone's publications are?  Ideally we would have a way of capturing this in some sort of index.  A few years ago I looked around and asked around and did not find anything out there of obvious direct relevance to what I wanted, so I started mapping out ways to do this.

When Aaron Swartz died I started drafting some ideas on this topic.  Here is what I wrote (in January 2013) but never posted:


With the death of Aaron Swartz on Friday there has been much talk of people posting their articles online (a short term solution) and moving more towards open access publishing (a long term solution).  One key component of the move to more open access publishing will be assessing people on just how good a job they are doing of sharing their academic work.

I have looked around the interwebs to see if there is some existing metric for this and I could not find one.  So I have decided to develop one - which I call the Swartz Openness Index (SOI).


Let A = # of objects being assessed (could be publications, data sets, software, or all of these together). 
Let B = # of objects that are released to the commons with a broad, open license. 
A simple (and simplistic) metric could be simply 
OI = B / A

This is a decent start but misses out on the degree of openness of different objects. So a more useful metric might be the one below.
A and B as above. 
Let C = # of objects available free of charge but not openly 
OI = ( B + (C/D) ) / A  
where D is the "penalty" for making material in C not openly available
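To make the arithmetic concrete, here is a minimal sketch of this penalty-weighted index in Python (the function name and the example counts are my own placeholders, not part of any established implementation):

def swartz_openness_index(total, open_licensed, free_not_open, penalty=2.0):
    """Penalty-weighted openness: OI = (B + C/D) / A.

    total         -- A, number of objects assessed (papers, data sets, software)
    open_licensed -- B, objects released to the commons with a broad open license
    free_not_open -- C, objects free to read but not openly licensed
    penalty       -- D, divisor that discounts free-but-closed objects
    """
    if total == 0:
        raise ValueError("no objects to assess")
    return (open_licensed + free_not_open / penalty) / total

# Made-up example: 100 papers, 60 openly licensed, 20 free but closed.
# With D = 2 the index is (60 + 20/2) / 100 = 0.70.
print(swartz_openness_index(100, 60, 20))  # 0.7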

This still does not seem detailed enough.  A more detailed approach might be to weight diverse aspects of the openness of the objects.  Consider for example the "Open Access Spectrum" (OAS).  This divides objects (publications in this case) into six categories of potential openness: reader rights, reuse rights, copyrights, author posting rights, automatic posting, and machine readability.  And each of these is graded into levels of openness.  It seems like a useful parsing in some ways.  Alas, since the OAS is bizarrely released under a somewhat restrictive CC BY-NC-ND license, I cannot technically make derivatives of it.  So I will not.  Mostly because I am pissed at PLOS and SPARC for releasing something in this way.  Inane.

But I can make my own openness spectrum.


And then I stopped writing because I was so pissed off at PLOS and SPARC for making something like this and then restricting its use.  I had a heated discussion with people from PLOS and SPARC about this but am not sure if they updated their policy.  Regardless, the concept of an Openness Index of some kind fell out of my head after this buzzkill.  And it only just now came back to me. (Though I note - I did not find the draft post I made until AFTER I wrote the rest of this post below ... )


To get some measure of openness in publications maybe a simple metric would be useful.  Something like the following
  • P = # of publications
  • A = # of fully open access papers
  • OI = Openness index
A simple OI would be
  • OI = 100 * A/P
However, one might want to account for relative levels of openness in this metric.  For example
  • AR = # of papers with an open but somewhat restricted license
  • F = # of papers that are freely available but not with an open license
  • C = some measure of how cheap the non freely available papers are
And so on.
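As a sketch of how those categories might be folded into a single percentage (the weights below are entirely arbitrary placeholders, not an established standard):

def openness_index(p, a, ar, f, w_ar=0.75, w_f=0.5):
    """Weighted openness index on a 0-100 scale.

    p  -- total publications
    a  -- fully open access papers (weight 1.0)
    ar -- papers with an open but somewhat restricted license (weight w_ar)
    f  -- papers freely available but not openly licensed (weight w_f)
    """
    return 100.0 * (a + w_ar * ar + w_f * f) / p

# Made-up example: 100 papers, 50 fully open, 20 restricted-open, 10 free.
# (50 + 0.75*20 + 0.5*10) / 100 = 70%
print(openness_index(p=100, a=50, ar=20, f=10))  # 70.0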

Given that I am not into library science myself and not really familiar with playing around with this type of data I thought a much simpler metric would be to just go to Pubmed (which of course works only for publications in the arenas covered by Pubmed).

From Pubmed one can pull out some simple data. 
  • # of publications (for a person or Institution)
  • # of those publications in PubMed Central (a measure of free availability)
Thus one could easily measure the "Pubmed Central" index as

PMCI = 100 * (# publications in PMC / # of publications in Pubmed)
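This is easy to script against NCBI's E-utilities.  Here is a rough sketch (the esearch endpoint and the "pubmed pmc[sb]" subset filter are real PubMed query syntax, but treat this as an approximation rather than the exact queries I ran):

import json
import urllib.parse
import urllib.request

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(term):
    """Return the number of PubMed records matching a query term."""
    url = EUTILS + "?" + urllib.parse.urlencode(
        {"db": "pubmed", "term": term, "retmode": "json"})
    with urllib.request.urlopen(url) as response:
        return int(json.load(response)["esearchresult"]["count"])

def pmci(author):
    """PMCI = 100 * (# publications in PMC / # publications in PubMed)."""
    total = pubmed_count(author + "[Author]")
    in_pmc = pubmed_count(author + "[Author] AND pubmed pmc[sb]")
    return 100.0 * in_pmc / total

print(round(pmci("Eisen JA"), 1))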

Some examples of the PMCI for various authors including some bigger names in my field, and some people I have worked with.

Name            PMC/PubMed   PMCI
Eisen JA        224/269      83.2
Eisen MB        76/104       73.1
Collins FS      192/521      36.8
Lander ES       160/377      42.4
Lipman DJ       58/73        79.4
Nussinov R      170/462      36.7
Mardis E        127/187      67.9
Colwell RR      237/435      54.5
Varmus H        165/408      40.4
Brown PO        164/234      70.1
Darling AE      20/27        74.0
Coop G          23/39        59.0
Salzberg SL     107/162      61.7
Venter JC       53/237       22.4
Ward NL         24/58        41.4
Fraser CM       78/262       29.8
Quackenbush J   95/225       42.2
Ghedin E        47/82        57.3
Langille MG     10/14        71.4



And so on.  Obviously this is of limited value / accuracy in many ways.  Many papers are freely available but not in Pubmed Central.  Many papers are not covered by Pubmed or Pubmed Central.  Times change, so some measure of recent publications might be better than measuring all publications.  Author identification is challenging (until systems like ORCID get more use).  And so on.

Another thing one can do with Pubmed is to identify papers with free full text available somewhere (not just in PMC).  This can be useful for cases where material is not put into PMC for some reason.  And then with a similar search one can narrow this to just the last five years.  As openaccess has become more common maybe some people have shifted to it more and more over time (I have -- so this search should give me a better index).
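Both variations are expressible in PubMed's query syntax.  A sketch, reusing the pubmed_count() helper from the PMCI sketch above ("free full text[sb]" and the [PDAT] date range are real PubMed filters; the exact date window below is my guess at "the last five years"):

def free_index(author, date_range=None):
    """Percent of an author's PubMed records with free full text anywhere."""
    term = author + "[Author]"
    if date_range:  # e.g. ("2009", "2014") for roughly the last five years
        term += ' AND ("%s"[PDAT] : "%s"[PDAT])' % date_range
    total = pubmed_count(term)
    free = pubmed_count(term + " AND free full text[sb]")
    return 100.0 * free / total

print(round(free_index("Eisen JA", ("2009", "2014")), 1))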

Let's call the % of publications with free full text somewhere the "Free Index" or FI.  Here are the values for the same authors.

Name            PMC/PubMed   PMCI   Free/PubMed (5 yr)   FI-5   Free (all)   FI-All
Eisen JA        224/269      83.2   178/180              98.9   237          88.1
Eisen MB        76/104       73.1   32/34                94.1   83           79.8
Collins FS      192/521      36.8   104/128              81.3   263          50.5
Lander ES       160/377      42.4   78/104               75.0   200          53.1
Lipman DJ       58/73        79.4   20/22                90.9   59           80.8
Mardis E        127/187      67.9   90/115               78.3   135          72.2
Colwell RR      237/435      54.5   31/63                49.2   258          59.3
Varmus H        165/408      40.4   21/28                75.0   206          50.5
Brown PO        164/234      70.1   20/21                95.2   185          79.0
Darling AE      20/27        74.0   18/21                85.7   21           77.8
Coop G          23/39        59.0   16/20                80.0   28           71.8
Salzberg SL     107/162      61.7   54/58                93.1   128          79.0
Venter JC       53/237       22.4   20/33                60.6   85           35.9
Ward NL         24/58        41.4   18/27                66.6   30           51.7
Fraser CM       78/262       29.8   9/13                 69.2   109          41.6
Quackenbush J   95/225       42.2   54/75                72.0   131          58.2
Ghedin E        47/82        57.3   30/36                83.3   56           68.3
Langille MG     10/14        71.4   11/13                84.6   11           78.6


Very happy to see that I score very well for the last five years. 180 papers in Pubmed.  178 of them with free full text somewhere that Pubmed recognizes. The large number of publications comes mostly from genome reports in the open access journals Standards in Genomic Sciences and Genome Announcements.  But most of my non genome report papers are also freely available.

I think in general it would be very useful to have measures of the degree of openness.  And such metrics should take into account sharing of other material like data, methods, etc.  In a way this could become part of the altmetrics calculations already going on.

But before going any further I decided to look again into what has been done in this area. When I first thought of doing this a few years ago I searched and asked around and did not see much of anything.  (Although I do remember someone out there - maybe Carl Bergstrom - saying there were some metrics that might be relevant, but I can't figure out who or what this information in the back of my head is.)

So I decided to do some searching anew.  And lo and behold, there was something directly relevant: a paper in the Journal of Librarianship and Scholarly Communication called "The Accessibility Quotient: A New Measure of Open Access," by Mathew A. Willmott, Katharine H. Dunn, and Ellen Finnie Duranceau from MIT.

Full Citation: Willmott, MA, Dunn, KH, Duranceau, EF. (2012). The Accessibility Quotient: A New Measure of Open Access. Journal of Librarianship and Scholarly Communication 1(1):eP1025. http://dx.doi.org/10.7710/2162-3309.1025

Here is the abstract:

Abstract
INTRODUCTION The Accessibility Quotient (AQ), a new measure for assisting authors and librarians in assessing and characterizing the degree of accessibility for a group of papers, is proposed and described. The AQ offers a concise measure that assesses the accessibility of peer-reviewed research produced by an individual or group, by incorporating data on open availability to readers worldwide, the degree of financial barrier to access, and journal quality. The paper reports on the context for developing this measure, how the AQ is calculated, how it can be used in faculty outreach, and why it is a useful lens to use in assessing progress towards more open access to research.
METHODS Journal articles published in 2009 and 2010 by faculty members from one department in each of MIT’s five schools were examined. The AQ was calculated using economist Ted Bergstrom’s Relative Price Index to assess affordability and quality, and data from SHERPA/RoMEO to assess the right to share the peer-reviewed version of an article.
RESULTS The results show that 2009 and 2010 publications by the Media Lab and Physics have the potential to be more open than those of Sloan (Management), Mechanical Engineering, and Linguistics & Philosophy.
DISCUSSION Appropriate interpretation and applications of the AQ are discussed and some limitations of the measure are examined, with suggestions for future studies which may improve the accuracy and relevance of the AQ.
CONCLUSION The AQ offers a concise assessment of accessibility for authors, departments, disciplines, or universities who wish to characterize or understand the degree of access to their research output, capturing additional dimensions of accessibility that matter to faculty.

I completely love it.  After all, it is directly related to what I have been thinking about and, well, they actually did some systematic analysis of their metrics.  I hope more things like this come out and are readily available for anyone to calculate.  Just how open someone is could be yet another metric used to evaluate them ...

And then I did a little more searching and found the following, which also seem directly relevant:

So - it is good to see various people working on such metrics.  And I hope there are more and more.

Anyway - I know this is a bit incomplete but I simply do not have time right now to turn this into a full study or paper and I wanted to get these ideas out there.  I hope someone finds them useful ...

Playing with Impact Story to look at Alt Metrics for my papers, data, etc

The future of science will include in part better evaluations of the impact of individual scientists, individual papers and individual other units such as data sets, software, presentations, etc.

 There are many efforts in this area of "Alt Metrics" and one I have been playing around with recently is Impact Story. It used to be called Total Impact but they changed their name and some of their focus. It is pretty easy to use.

 One thing you can do is to create "A Collection." To do this you go to their site, you register, and then you select "Create Collection". And you add some information there

Among the information you can include: 

  • ORCID ID: ORCID is a new system for unique author IDs.  Once you get your unique ID you can curate / update your papers at the site (the site needs some work ... some issues there with duplication).  I have gotten my ORCID ID and am updating my publications there.
  • Articles from Google Scholar profile.  This allows one to upload a BibTeX file of one's publication list from Google Scholar.  To get this, you need a Google Scholar page.  I have one here.  I have been playing a lot with Google Scholar recently: The Tree of Life: Wow - Google Scholar "Updates" a big step forward ... and The Tree of Life: Thank you Google Scholar Updates for finding me ... but did not realize it had a BibTeX export function until now.  From the drop down menu one selects "Export" and then can export one's publications (in the screen capture below the default option is Actions).  Once you get a BibTeX file you can upload it to ImpactStory.
  • Article and Dataset IDs.  Here one can enter PubMed IDs or DOIs for other publications or datasets. Since most / all of my papers are in my BibTeX export and ORCID record, what I imagine using this for is data from places like Figshare and DataDryad.  (See the sketch after this list for one quick way to pull DOIs out of a BibTeX export.)
  • Webpage URLs.  One can include URLs here.  But so far my experience has been that they do not have a good system of assessing webpages.
  • Slideshare username.  If you are not posting slides and other materials on Slideshare, get with the program.  I post all my talks there.  And other things.  
  • Github Username.  A good place to post code/software.  We are doing this more and more in my lab.  I have a username though I don't do much there myself.  
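As an aside, if the entries in your BibTeX export happen to include doi fields, a few lines of Python can pull them out for pasting into the "Article and Dataset IDs" box.  A rough sketch; "citations.bib" is a placeholder filename, and note that Google Scholar exports often omit DOI fields, so this may catch only some entries:

import re

# Rough sketch: collect DOI fields from a BibTeX export.
# "citations.bib" is a placeholder filename for your Scholar export.
with open("citations.bib") as fh:
    bib = fh.read()

# Match lines like: doi = {10.1371/journal.pone.0018011}
dois = re.findall(r'doi\s*=\s*[{"]([^}"]+)[}"]', bib, flags=re.IGNORECASE)
print("\n".join(sorted(set(dois))))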
And then give your collection a name and click go.  It takes a bit of time to finish the initial collection creation with my list of materials, but it is fascinating and very useful once done.  Here is a link to a collection "Jonathan Eisen try #3" I recently made.  I have not added everything to it but it is still a good record of how many of my contributions are being used.



My favorite thing to do so far is to click "expand all" from the menu which then shows the detailed Alt Metrics for everything.  



  • PDF views.
  • HTML Views. 
  • Facebook shares.  
  • Twitter shares.  
  • And much more. 

It does not seem perfect - not sure how the metrics are quantified for things like Twitter and Facebook.  But it gives a decent indication of how much chatter and use there is of various materials.

And you can export all the information for your own private use.  I can imagine this being VERY useful for promotion/tenure/other review actions.

I also sniffed around the site and found some nice features on their API page.  I especially like the embed function for specific DOIs.  You copy their text and change the DOI and you get a nice graphical summary of Alt Metrics for that DOI.  See an example at the bottom of the post.  I am probably going to add this to my publication lists on the web.

It is important to realize this is a BETA version. Still needs some work. But LOTS of cool things to play with. The future is here and I like it. Time to end reliance on indirect measures of the impact of papers and data (e.g., Journal Impact Factor). Time to measure actual impact. And this is a good tool to help do that.



[Embedded ImpactStory widget for doi:10.1371/journal.pone.0018011]
More playing around with Total Impact

I am continuing to play around with Total Impact (see for example total-impact: Jonathan Eisen). This is a new (beta) system for tracking individual impact of scientific productivity including papers, presentations, data, etc.

So far I like the general things I am seeing there.  They ask for feedback on their site, and in the interest of openness I am posting some things I would love to see here:

1. Sorting by publication date or any of the metadata categories (e.g., citations, downloads)
2. Better way of saving DOI lists such that if you get a new publication you can just add to the list

...

Lots of other things obviously but it is an early beta version so I am willing to be patient.  Definitely worth playing around with.