The Wayback Machine - https://web.archive.org/web/20171102234006/http://opcit.eprints.org/oacitation-biblio.html

See also: Papers produced by the project

The effect of open access and downloads ('hits') on citation impact: a bibliography of studies

Gargouri, Y., et al., Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research, PLOS ONE, 5(10): e13636, October 18, 2010

@ 25 June 2013
Find your way through the bibliography
Selected topic ALERT boxes: OA impact biblio rapid reader | Reviews of OA impact studies
Latest additions
Studies with original data
Web tools for measuring impact | Comparative reviews
Background
The financial imperative: correlating research access, impact and assessment | Citation analysis, indexes and impact factors | Open access

Last updated 25 June 2013; first posted 15 September 2004. If you have any additions, corrections or comments, please tweet @stevehit #oaimpact or email Steve Hitchcock.

What others say about this bibliography





Open Access Policies. This bibliography is cited in support of the following open access policies, statements and guidelines for authors:
Economics. This bibliography is cited by:

Introduction to the bibliography

Despite significant growth in the number of research papers available through open access, principally through author self-archiving in institutional archives, it is estimated that only c. 20% of the number of papers published annually are open access. It is up to the authors of papers to change this. Why might open access be of benefit to authors? One universally important factor for all authors is impact, typically measured by the number of times a paper is cited (some older studies have estimated monetary returns to authors from article publication via the role citations play in determining salaries). Recent studies have begun to show that open access increases impact. More studies and more substantial investigations are needed to confirm the effect, although a simple example illustrates it.

This chronological bibliography is intended to describe progress in reporting these studies; it also lists the Web tools available to measure impact. It is a focused bibliography, on the relationship between impact and access. It does not attempt to cover citation impact, or other related topics such as open access, more generally, although some key papers in these areas are listed as jump-off points for wider study.

Latest additions to the bibliography

Björn Brembs, Katherine Button and Marcus Munafò (2013)
Deep impact: unintended consequences of journal rank
Front. Hum. Neurosci., 7:291, published online: 24 June 2013. Also in arXiv.org > cs > arXiv:1301.3748, v1, 16 Jan 2013 http://arxiv.org/abs/1301.3748
doi: 10.3389/fnhum.2013.00291
Abstract: Most researchers acknowledge an intrinsic hierarchy in the scholarly journals (journal rank) that they submit their work to, and adjust not only their submission but also their reading strategies accordingly. On the other hand, much has been written about the negative effects of institutionalizing journal rank as an impact measure. So far, contributions to the debate concerning the limitations of journal rank as a scientific impact assessment tool have either lacked data, or relied on only a few studies. In this review, we present the most recent and pertinent data on the consequences of our current scholarly communication system with respect to various measures of scientific quality (such as utility/citations, methodological soundness, expert ratings or retractions). These data corroborate previous hypotheses: using journal rank as an assessment tool is bad scientific practice. Moreover, the data lead us to argue that any journal rank (not only the currently-favored Impact Factor) would have this negative impact. Therefore, we suggest that abandoning journals altogether, in favor of a library-based scholarly communication system, will ultimately be necessary. This new system will use modern information technology to vastly improve the filter, sort and discovery functions of the current journal system.

Valeria Aman (2013)
The potential of preprints to accelerate scholarly communication - A bibliometric analysis based on selected journals
arXiv.org > cs > arXiv:1306.4856, 20 Jun 2013
Master Thesis. Abstract: This paper quantifies to which extent preprints in arXiv accelerate scholarly communication. The following subject fields were investigated up to the year 2012: High Energy Physics (HEP), Mathematics, Astrophysics, Quantitative Biology, and Library and Information Science (LIS). Publication and citation data was downloaded from Scopus and matched with corresponding preprints in arXiv. Furthermore, the INSPIRE HEP database was used to retrieve citation data for papers related to HEP. The bibliometric analysis deals with the growth in numbers of articles published having a previous preprint in arXiv and the publication delay, which is defined as the chronological distance between the deposit of a preprint in arXiv and its formal journal publication. Likewise, the citation delay is analyzed, which describes the time it takes until the first citation of preprints, and articles, respectively. Total citation numbers are compared for sets of articles with a previous preprint and those without. The results show that in all fields but biology a significant citation advantage exists in terms of speed and citation rates for articles with a previous preprint version on arXiv.

Carsten Nieder, Astrid Dalhaug and Gro Aandahl (2013)
Correlation between article download and citation figures for highly accessed articles from five open access oncology journals
SpringerPlus, 2:261, 13 June 2013
doi: 10.1186/2193-1801-2-261
Abstract (provisional): Different approaches can be chosen to quantify the impact and merits of scientific oncology publications. These include source of publication (including journal reputation and impact factor), whether or not articles are cited by others, and access/download figures. When relying on citation counts, one needs to obtain access to citation databases and has to consider that results differ from one database to another. Accumulation of citations takes time and their dynamics might differ from journal to journal and topic to topic. Therefore, we wanted to evaluate the correlation between citation and download figures, hypothesising that articles with fewer downloads also accumulate fewer citations. Typically, publishers provide download figures together with the article. We extracted and analysed the 50 most viewed articles from 5 different open access oncology journals. For each of the 5 journals and also all journals combined, correlation between number of accesses and citations was limited (r = 0.01-0.30). Considerable variations were also observed when analyses were restricted to specific article types such as reviews only (r = 0.21) or case reports only (r = 0.53). Even if year of publication was taken into account, high correlation coefficients were the exception from the rule. In conclusion, downloads are not a universal surrogate for citation figures.

Vincent Lariviere, Cassidy R. Sugimoto, Benoit Macaluso, Stasa Milojevic, Blaise Cronin, Mike Thelwall (2013)
arXiv e-prints and the journal of record: An analysis of roles and relationships
arXiv.org > cs > arXiv:1306.3261, 13 Jun 2013
Abstract: Since its creation in 1991, arXiv has become central to the diffusion of research in a number of fields. Combining data from the entirety of arXiv and the Web of Science (WoS), this paper investigates (a) the proportion of papers across all disciplines that are on arXiv and the proportion of arXiv papers that are in the WoS, (b) elapsed time between arXiv submission and journal publication, and (c) the aging characteristics and scientific impact of arXiv e-prints and their published version. It shows that the proportion of WoS papers found on arXiv varies across the specialties of physics and mathematics, and that only a few specialties make extensive use of the repository. Elapsed time between arXiv submission and journal publication has shortened but remains longer in mathematics than in physics. In physics, mathematics, as well as in astronomy and astrophysics, arXiv versions are cited more promptly and decay faster than WoS papers. The arXiv versions of papers - both published and unpublished - have lower citation rates than published papers, although there is almost no difference in the impact of the arXiv versions of both published and unpublished papers.

Pekka Olsbo (2013)
Does Openness and Open Access Policy Relate to the Success of Universities?
17th International Conference on Electronic Publishing, Karlskrona, Sweden, June 13-14, 2013
full paper: http://elpub.scix.net/data/works/att/110_elpub2013.content.01124.pdf
Extended Abstract. Introduction: The cross reading and examination of the report The state of scientific research in Finland 2012 by the Finnish Academy and Ranking Web of Universities seem to show that there could be a connection between the internet visibility, ranking and the relative citation impact of universities in different countries. These relationships can be traced back to the effectiveness of the open access publishing, self-archiving and Open Access policies of the countries and the universities. This paper focuses on the internet visibility of the University of Jyväskylä and eight European countries and how the openness of universities has developed during the last two editions of the Ranking Web of Universities.

Mike Taylor (2013)
The Challenges of Measuring Social Impact Using Altmetrics
Research Trends, Issue 33, June 2013
Abstract: Altmetrics gives us novel ways of detecting the use and consumption of scholarly publishing beyond formal citation, and it is tempting to treat these measurements as proxies for social impact. However, altmetrics is still too shallow and too narrow, and needs to increase its scope and reach before it can make a significant contribution to computing relative values for social impact. Furthermore, in order to go beyond limited comparisons of like-for-like and to become generally useful, computation models must take into account different socio-economic characteristics and legal frameworks. However, much of the necessary work can be borrowed from other fields, and the author concludes that, with certain extensions and added sophistication, altmetrics will be a valuable element in calculating social reach and impact.

Occasional series: may have missed ...

Roudabeh Torabian, Alireza Heidari, Maryam Shahrifar, Esmail Khodadi, Safar Ali Esmaeile Vardanjani (2012)
The Relation between Self-Citation and Impact Factor in Medical Science Open Access Journals in ISI & DOAJ Databases
Life Science Journal, 9(4), 2206-2209, December 25, 2012
From the Abstract: This research aims at investigating the relation between self-citation and impact factor in the open access journals indexed in ISI and DOAJ in medical science in 2007-08. In this research, indexes such as the relation between a journal's self-citation and its impact factor, and the effect of the journal's self-citation rate on open access performance, are investigated. The research method is an analytical one conducted using the citation analysis technique. SPSS statistical software was used to examine and analyze the data, and inferential analysis methods such as the Pearson correlation were used as well. The statistical population includes 168 journals. The results showed a self-citation rate of 28% for the journals. The findings indicate that there is a significant relation between self-citation and impact factor. After omitting self-citation, 60% of the titles in medical science experienced a ranking increase, 27% experienced a ranking decrease and 13% remained unchanged.

Mark J. McCabe, Christopher M. Snyder (2013)
The Rich Get Richer and the Poor Get Poorer: The Effect of Open Access on Cites to Science Journals Across the Quality Spectrum
Social Science Research Network SSRN, May 25, 2013
Abstract: An open-access journal allows free online access to its articles, obtaining revenue from fees charged to submitting authors. Using panel data on science journals, we are able to circumvent some problems plaguing previous studies of the impact of open access on citations. We find that moving from paid to open access increases cites by 8% on average in our sample, but the effect varies across the quality of content. Open access increases cites to the best content (top-ranked journals or articles in upper quintiles of citations within a volume) but reduces cites to lower-quality content. We construct a model to explain these findings in which being placed on a broad open-access platform can increase the competition among articles for readers' attention. We can find structural parameters allowing the model to fit the quintile results quite closely.

San Francisco Declaration on Research Assessment (DORA) (2013)
American Society for Cell Biology (ASCB), 17 May 2013
From DORA: There is a pressing need to improve the ways in which the output of scientific research is evaluated by funding agencies, academic institutions, and other parties. ... The Journal Impact Factor is frequently used as the primary parameter with which to compare the scientific output of individuals and institutions. The Journal Impact Factor, as calculated by Thomson Reuters, was originally created as a tool to help librarians identify journals to purchase, not as a measure of the scientific quality of research in an article. With that in mind, it is critical to understand that the Journal Impact Factor has a number of well-documented deficiencies as a tool for research assessment. These limitations include: A) citation distributions within journals are highly skewed; B) the properties of the Journal Impact Factor are field-specific: it is a composite of multiple, highly diverse article types, including primary research papers and reviews; C) Journal Impact Factors can be manipulated (or "gamed") by editorial policy; and D) data used to calculate the Journal Impact Factors are neither transparent nor openly available to the public. We make a number of recommendations for improving the way in which the quality of research output is evaluated. A number of themes run through these recommendations:

See also
Bruce Alberts, Editorial: Impact Factor Distortions, Science, Vol. 340 no. 6134, 787, 17 May 2013, DOI: 10.1126/science.1240319
Kent Anderson, Impact Crater - Does DORA Need to Attack the Impact Factor to Reform How It Is Used in Academia? The Scholarly Kitchen, May 21, 2013
Andrew Plume, San Francisco Declaration on Research Assessment (DORA) - Elsevier's view, Elsevier Connect, 3 June 2013; "Elsevier is not signing DORA in its entirety, however, as it's not our place to advocate for positions that are primarily aimed at other partners in the research community. Mendeley is signing DORA on its own."

Simple and practical example

Citation analysis is specialised and difficult. To make the case for, or against, a claim such as 'open access increases impact' requires a lot of the reader, who may not be a specialist but who wants to try and understand the point at issue and decide if it has any relevance to him or her. The following simple example is included for this reason, not as proof but as evidence of the effect within a particular domain. Draw your own conclusions, and then read the more detailed evidence of the bibliography if you are still interested.

"Measuring the effect for physics or astronomy is easy. This link returns the number of articles published in the Astrophysical Journal in 2003 and their number of citations.

"This next link shows the number of these papers which are available OA in the arXiv, and their citations.

"The result is that 75% of the papers are in the arXiv, and they represent 90% of the citations, a 250% OA effect.

"By replacing ApJ with the mnemonic for any other physics or astronomy journal one can repeat the measurement; for Nuclear Physics A (NuPhA) one gets that 32% of the articles are in the arXiv, and they represent 78% of the citations, a 740% OA effect."
From Michael Kurtz, American Scientist Open Access Forum, 28 September 2005, http://users.ecs.soton.ac.uk/harnad/Hypermail/Amsci/4807.html
Note, the database links are 'live', i.e. they return the current database figures, not the exact figures on which Michael Kurtz would have based his calculations, but the percentages quoted are unlikely to change dramatically, in the short term at least.

Elucidation of calculation (by Stevan Harnad, figures valid on 22 July 2007)
For ApJ:
TOT: articles 2592 citations 70732
Arx: articles 1943 citations 62586 c/a 32.21 (rounded to 32)
Non: articles 649 citations 8146 c/a 12.55 (rounded to 13)
Then 32/13 = 2.5 (250%)

For NuPhA:
TOT: articles 1134 citations 4451
Arx: articles 344 citations 3225 c/a 9.375
Non: articles 790 citations 1226 c/a 1.552
Then 9.375/1.552 = 6.041 (600%)

Michael Kurtz comments: "The differences in (NuPhA: 740% to 600% effect) results are because the database has changed over the past two years since I did it. There is a systematic error in the calculations for Nuclear Physics A (Elsevier does not give us the references) so the results will be higher than the true value. Physical Review C (Nuclear Physics) has an OA advantage number of 221%; the systematic in this case is small and in the other direction."
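The arithmetic behind these OA-advantage figures is simply the ratio of citations-per-article rates for arXiv versus non-arXiv papers. A minimal sketch in Python, using the 22 July 2007 figures quoted above (the helper name `oa_advantage` is ours, for illustration only):

```python
# Sketch of the OA citation-advantage calculation described above.
# Figures are the 22 July 2007 snapshot quoted in the elucidation.

def oa_advantage(arxiv_articles, arxiv_citations, non_articles, non_citations):
    """Ratio of citations-per-article for arXiv papers vs. non-arXiv papers."""
    arxiv_rate = arxiv_citations / arxiv_articles   # c/a for OA papers
    non_rate = non_citations / non_articles         # c/a for non-OA papers
    return arxiv_rate / non_rate

# Astrophysical Journal (ApJ), 2003: c/a 32.21 vs 12.55
apj = oa_advantage(1943, 62586, 649, 8146)    # ~2.57, i.e. a ~250% effect

# Nuclear Physics A (NuPhA): c/a 9.375 vs 1.552
nupha = oa_advantage(344, 3225, 790, 1226)    # ~6.04, i.e. a ~600% effect

print(f"ApJ OA advantage: {apj:.2f}x")
print(f"NuPhA OA advantage: {nupha:.2f}x")
```

Run against the live database links, the ratios will drift over time, as Kurtz notes, but the calculation itself is unchanged.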

Studies with original data

Highlights

Reviews of OA impact studies

Lawrence (2001) was the first to publish data recognising the trend for online publication to increase impact, confirmed for open access papers by the work of the Open Citation Project based on arXiv (e.g. Harnad and Brody, D-Lib, 2004), and by Kurtz et al. (2004a, 2003a) looking at the NASA Astrophysics Data System. Commenting on Harnad and Brody (D-Lib, 2004) in Open Access News, Peter Suber said:
This is an important article. It's the first major study since the famous Lawrence paper documenting the proposition that OA increases impact. It's also the first to go beyond Lawrence in scope and method in order to answer doubts raised about his thesis. By confirming that OA increases impact, it gives authors the best of reasons to provide OA to their own work (21 June 2004)
Broader collaborations have emerged to extend these findings (e.g. Brody et al. 2004).

Open access has become feasible because of the move towards online publication and dissemination. A new measure that becomes possible with online publication is the number of downloads or 'hits', opening a new line of investigation. Brody et al. have been prominent in showing there is a correlation between higher downloads and higher impact, particularly for high impact papers, holding out the promise not just of higher impact resulting from open access but of the ability to predict high impact papers much earlier, not waiting years for those citations to materialise (e.g. Brody and Harnad 2005). The effect can be verified with the Correlation Generator (below).
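The download-citation relationship described above is, at its core, a Pearson correlation between early download counts and later citation counts, computed per paper. A minimal sketch with made-up figures (both the data and the `pearson_r` helper are illustrative, not taken from the Correlation Generator):

```python
# Illustrative sketch only: the kind of download/citation correlation
# the Correlation Generator computes. The per-paper numbers are invented.
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-paper figures: downloads in the first six months,
# citations accumulated after two years.
downloads = [120, 45, 300, 80, 210, 15]
citations = [10, 2, 25, 6, 14, 1]

print(f"r = {pearson_r(downloads, citations):.2f}")  # high positive r here
```

A strongly positive r on real data is what would support using early downloads as an early predictor of eventual citation impact.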

(Note. The latest listings might include preprints, or even pre-preprints. This area of study is effectively a work in progress, and as such the list is intended to raise awareness of the most recent results, even where these may not be definitive or final versions. Check back for definitive versions.)

Cassidy R. Sugimoto, Mike Thelwall, Vincent Larivière, Andrew Tsou, Philippe Mongeon, Benoit Macaluso (2013)
Scientists Popularizing Science: Characteristics and Impact of TED Talk Presenters
PLoS ONE, 8(4): e62403, April 30, 2013
doi: 10.1371/journal.pone.0062403
Abstract: The TED (Technology, Entertainment, Design) conference and associated website of recorded conference presentations (TED Talks) is a highly successful disseminator of science-related videos, claiming over a billion online views. Although hundreds of scientists have presented at TED, little information is available regarding the presenters, their academic credentials, and the impact of TED Talks on the general population. This article uses bibliometric and webometric techniques to gather data on the characteristics of TED presenters and videos and analyze the relationship between these characteristics and the subsequent impact of the videos. The results show that the presenters were predominately male and non-academics. Male-authored videos were more popular and more liked when viewed on YouTube. Videos by academic presenters were more commented on than videos by others and were more liked on YouTube, although there was little difference in how frequently they were viewed. The majority of academic presenters were senior faculty, males, from United States-based institutions, were visible online, and were cited more frequently than average for their field. However, giving a TED presentation appeared to have no impact on the number of citations subsequently received by an academic, suggesting that although TED popularizes research, it may not promote the work of scientists within the academic community.
See also
Sara Grossman, Giving a TED Talk? Expect More Visibility, but Not More Citations, Chronicle of Higher Education, June 19, 2013

Xianwen Wang, Wenli Mao, Shenmeng Xu, Chunbo Zhang (2013)
Attention History Over Time of Scientific Literature: Metrics of Nature Publications
arXiv.org > cs > arXiv:1304.7653, 29 Apr 2013
Abstract: In this study, we report findings about patterns of the Nature metrics page views over time. Using the page views data of papers published in Nature, we calculate from two perspectives. The first one is the time before their page views reach 50%/80% of the total, and the second one is the percentage of total page views in 7 days, 30 days, and 100 days after publication. Papers are viewed most frequently within a short time period after publication. Respectively, 62.16% and 100% of the papers in Nature are viewed more than half of their total times in the first week and month. 52.48% of the papers gain more than 80% of their total views in the following month. Meanwhile, the page views number reaches more than 52% of the total in the first week and more than 72% in the first month. In addition, we find that readers' attention on Open Access publications is more enduring. Using the usage data of a newly published paper, we conduct regression analysis to predict the future expected total usage count of the paper.

Teja Koler-Povh, Goran Turk and Primož Južnič (2013)
Does the Open Access Business Model Have a Significant Impact on the Citation of Publications? Case Study in the Field of Civil Engineering
Proceedings of the Fifth Belgrade International Open Access Conference 2012, 26 April 2013
DOI: 10.5937/BIOAC-68
Original conference slide presentation, 18 May 2012: http://boac.ceon.rs/public/site/Koler-Povh_Juznic_Turk.pdf
From the abstract: We have chosen to analyze the publications in three international journals in the field of civil engineering. All of them have an ISI impact factor in the Civil engineering subject category in the ISI/Web of Science database (WOS). The articles were classified into two groups - the OA publications and the non-OA publications. We analyzed all the articles published in the same year and the number of their citations until the end of February 2012, seeking to find out if these two groups differ from each other. From the conclusion: It was found that OA significantly influenced the citation counts for the articles published in the Computers & Structures journal, which is ranked in the first quarter according to both databases. Only the GS database showed a significant effect of OA on citations for the articles published in the Journal of Computing in Civil Engineering. Neither the GS nor the WOS database indicates a significant effect of OA on the citation counts of articles in the Automation in Construction journal. These two journals are ranked in the second quarter among 88 journals in the same subject category, civil engineering. The present results indicate that more research is needed to give a final answer to the principal question of the paper: does open access have a significant impact on citations in the field of civil engineering? Some other potentially influential factors will be tested as well.

Vincent Larivière, George A. Lozano, Yves Gingras (2013)
Are elite journals declining?
arXiv.org > cs > arXiv:1304.6460, 24 Apr 2013
Previous work indicates that over the past 20 years, the highest quality work has been published in an increasingly diverse and larger group of journals. In this paper we examine whether this diversification has also affected the handful of elite journals that are traditionally considered to be the best. We examine citation patterns over the past 40 years of 7 long-standing traditionally elite journals and 6 journals that have been increasing in importance over the past 20 years. To be among the top 5% or 1% cited papers, papers now need about twice as many citations as they did 40 years ago. Since the late 1980s and early 1990s elite journals have been publishing a decreasing proportion of these top cited papers. This also applies to the two journals that are typically considered as the top venues and often used as bibliometric indicators of "excellence", Science and Nature. On the other hand, several new and established journals are publishing an increasing proportion of most cited papers. These changes bring new challenges and opportunities for all parties. Journals can enact policies to increase or maintain their relative position in the journal hierarchy. Researchers now have the option to publish in more diverse venues knowing that their work can still reach the same audiences. Finally, evaluators and administrators need to know that although there will always be a certain prestige associated with publishing in "elite" journals, journal hierarchies are in constant flux, so inclusion of journals into this group is not permanent.
See also
Hadas Shema, Elite journals: to hell in a handbasket? Scientific American blogs, May 2, 2013

Paola Bongiovani, Sandra Miguel, Nancy Diana Gómez
Acceso abierto, impacto científico y la producción científica en dos universidades argentinas en el campo de la medicina (Open Access, scientific impact and scientific production in two Argentine universities in the field of medicine)
Revista Cubana de Información en Ciencias de la Salud, Vol 24, No 2, 2013
Abstract: This paper studies the scientific production published by researchers at two Argentine universities (National University of La Plata and National University of Rosario) in the discipline of medicine. Objective: to establish the volume and evolution of scientific production published in open access journals and in subscription journals that allow self-archiving in repositories. Methods: The scientific production for both institutions was determined by taking a sample from Scopus covering the period 2006-2010. It applies a methodology based on the analysis of the access models of the journals used by researchers to publish their articles, established through searches performed using Romeo-Sherpa, Dulcinea, DOAJ, SciELO, RedALyC and PubMed Central. Additionally, the study explores the citation levels of articles from both institutions according to the access models of the journals, comparing impact indicators based on average citations per article. Results: The two institutions generally show similar patterns to those found at the national level, although UNR, following international trends in Medicine, has a higher percentage of articles published in open access journals. In both cases, about half of the production could be deposited in repositories, with pre-print versions and the author's post-print mostly allowed by editors. Conclusions: From the perspective of the impact levels achieved, the results indicate a higher level of citation in subscription journals with self-archiving permissions, and this is encouraging for the promotion and development of institutional repositories in both universities.

Xin Shuai, Zhuoren Jiang, Xiaozhong Liu, Johan Bollen (2013)
A Comparative Study of Academic Impact and Wikipedia Ranking
Author preprint. Joint Conference on Digital Libraries JCDL 2013, Indianapolis, IN, July 22-26, 2013, to be presented
Abstract: In addition to its broad popularity, Wikipedia is also widely used for scholarly purposes. Many Wikipedia pages pertain to academic papers, scholars and topics, providing a rich ecology for scholarly uses. Although many recognize the scholarly potential of Wikipedia, as a crowdsourced encyclopedia its authority and quality are questioned due to the lack of rigorous peer review and supervision. Scholarly references and mentions on Wikipedia may thus shape the societal impact of a certain scholarly communication item, but it is not clear whether they shape actual academic impact. In this paper we compare the impact of papers, scholars, and topics according to two different measures, namely scholarly citations and Wikipedia mentions. Our results show that academic and Wikipedia impact are positively correlated. Papers, authors, and topics that are mentioned on Wikipedia have higher academic impact than those that are not mentioned. Our findings validate the hypothesis that Wikipedia can help assess the impact of scholarly publications and underpin relevance indicators for scholarly retrieval or recommendation systems.

Heather Piwowar, Todd J Vision (2013)
Data reuse and the open data citation advantage
PeerJ PrePrints, v1, 4 Apr 2013
doi: 10.7287/peerj.preprints.1
From the Abstract: Previous studies have found that papers with publicly available datasets receive a higher number of citations than similar studies without available data. However, few previous analyses have had the statistical power to control for the many variables known to predict citation rate, which has led to uncertain estimates of the citation boost. Furthermore, little is known about patterns in data reuse over time and across datasets. METHOD AND RESULTS: Here, we look at citation rates while controlling for many known citation predictors, and investigate the variability of data reuse. In a multivariate regression on 10,555 studies that created gene expression microarray data, we found that studies that made data available in a public repository received 9% (95% confidence interval: 5% to 13%) more citations than similar studies for which the data was not made available. Date of publication, journal impact factor, open access status, number of authors, first and last author publication history, corresponding author country, institution citation history, and study topic were included as covariates. The citation boost varied with date of dataset deposition: a citation boost was most clear for papers published in 2004 and 2005, at about 30%. ... CONCLUSION: After accounting for other factors affecting citation rate, we find a robust citation benefit from open data, although a smaller one than previously reported. We conclude there is a direct effect of third-party data reuse that persists for years beyond the time when researchers have published most of the papers reusing their own data.

Philip M. Davis (2013)
Public accessibility of biomedical articles from PubMed Central reduces journal readership - retrospective cohort analysis
The FASEB Journal, Federation of American Societies for Experimental Biology, April 3, 2013
doi: 10.1096/fj.13-229922
Abstract: Does PubMed Central, a government-run digital archive of biomedical articles, compete with scientific society journals? A longitudinal, retrospective cohort analysis of 13,223 articles (5999 treatment, 7224 control) published in 14 society-run biomedical research journals in nutrition, experimental biology, physiology, and radiology between February 2008 and January 2011 reveals a 21.4% reduction in full-text hypertext markup language (HTML) article downloads and a 13.8% reduction in portable document format (PDF) article downloads from the journals' websites when U.S. National Institutes of Health-sponsored articles (treatment) become freely available from the PubMed Central repository. In addition, the effect of PubMed Central on reducing PDF article downloads is increasing over time, growing at a rate of 1.6% per year. There was no longitudinal effect for full-text HTML downloads. While PubMed Central may be providing complementary access to readers traditionally underserved by scientific journals, the loss of article readership from the journal website may weaken the ability of the journal to build communities of interest around research papers, impede the communication of news and events to scientific society members and journal readers, and reduce the perceived value of the journal to institutional subscribers.

David J Solomon, Mikael Laakso, Bo-Christer Björk (2013)
A longitudinal comparison of citation rates and growth among open access journals
Author preprint, 27 March 2013. In Journal of Informetrics, accepted for publication
Abstract: The study documents the growth in the number of journals and articles along with the increase in normalized citation rates of open access (OA) journals listed in the Scopus bibliographic database between 1999 and 2010. Longitudinal statistics on growth in journals/articles and citation rates are broken down by funding model, discipline, and whether the journal was launched or had converted to OA. The data were retrieved from the web sites of SCImago Journal and Country Rank (journal/article counts), Journal Metrics (SNIP2 values), Scopus (journal discipline) and the Directory of Open Access Journals (DOAJ) (OA and funding status). OA journals/articles have grown much faster than subscription journals but still make up less than 12% of the journals in Scopus. Two-year citation averages for journals funded by article processing charges (APCs) have reached the same level as subscription journals. Citation averages of OA journals funded by other means continue to lag well behind OA journals funded by APCs and subscription journals. We hypothesize this is less an issue of quality than due to the fact that such journals are commonly published in languages other than English and tend to be located outside the four major publishing countries.

Jerome Vanclay (2013)
Factors affecting citation rates in Environmental Science
ePublications@SCU, Southern Cross University, 2013. In Journal of Informetrics, Vol 7, No 2, April 2013, 265-271, http://www.sciencedirect.com/science/article/pii/S1751157712000995
http://dx.doi.org/10.1016/j.joi.2012.11.009
Abstract: Analysis of 131 publications during 2006-07 by staff of the School of Environmental Science and Management at Southern Cross University reveals that the journal impact factor, article length and type (i.e., article or review), and journal self-citations affect the citations accrued to 2012. Authors seeking to be well cited should aim to write comprehensive and substantial review articles, and submit them to journals with a high impact factor that have previously carried articles on the topic. Nonetheless, strategic placement of articles is complementary to, and no substitute for, careful crafting of good quality research. Evidence remains equivocal regarding the contribution of an author's prior publication success (h-index) and of open-access journals.

Caitlin Rivers (2013)
Scholarly impact of open access journals
18 March 2013 (originally written on 26 January 2013)
Data sources for this work also available on Figshare, http://figshare.com/articles/Open_access_journal_impacts/154346
Extracts: I downloaded a list of open access journals from the Directory of Open Access Journals (DOAJ). I also downloaded a spreadsheet of 2011 impact data from Journal Metrics, an offshoot of Scopus that assesses journal impact. Journal Metrics provides two impact measures: Source Normalized Impact per Paper (SNIP) and SCImago Journal Rank (SJR). ... In terms of impact, open access still lags behind non-OA journals. The mean SNIP of non-OA journals in 2011 was 0.83 with a max of 41, while OA journals had a mean SNIP of .57 and a max of less than 5. The mean SJR of non-OA was .64 (max of 36), and the OA mean was .38 (max of 7.6).

Mark J. McCabe, Christopher M. Snyder (2013)
Does Online Availability Increase Citations? Theory and Evidence from a Panel of Economics and Business Journals
Social Science Research Network (SSRN), 14 March 2013, also at http://mccabe.people.si.umich.edu/McCabe_Snyder_Revised_3_2013.pdf. Submitted to Review of Economics and Statistics.
Revised preprint. Abstract: Does online availability boost citations? The answer has implications for issues ranging from the value of a citation to the sustainability of open-access journals. Using panel data on citations to economics and business journals, we show that the enormous effects found in previous studies were an artifact of their failure to control for article quality, disappearing once we add fixed effects as controls. The absence of an aggregate effect masks heterogeneity across platforms: JSTOR stands apart from others, boosting citations around 10%. We examine other sources of heterogeneity, including whether JSTOR increases cites from authors in developing more than developed countries and increases cites to long-tail more than superstar articles. Our theoretical analysis informs the econometric specification and allows us to translate our results for citation increases into welfare terms.

Richard C. Doty (2013)
Tenure-Track Science Faculty and the 'Open Access Citation Effect'
Journal of Librarianship and Scholarly Communication, 1 (3), Mar 2013
info:doi/10.7710/2162-3309.1052
From the Abstract: The observation that open access (OA) articles receive more citations than subscription-based articles is known as the OA citation effect (OACE). Implicit in many OACE studies is the belief that authors are heavily invested in the number of citations their articles receive. This study seeks to determine what influence the OACE has on the decision-making process of tenure-track science faculty when they consider where to submit a manuscript for publication. METHODS: Fifteen tenure-track faculty members in the Departments of Biology and Chemistry at the University of North Carolina at Chapel Hill participated in semi-structured interviews employing a variation of the critical incident technique. RESULTS: Seven of the fifteen faculty members said they would consider making a future article freely available based on the OACE. Due to dramatically different expectations with respect to the size of the OACE, however, only one of them is likely to seriously consider the OACE when deciding where to submit their next manuscript for publication. DISCUSSION: Journal reputation and audience, and the quality of the editorial and review process are the most important factors in deciding where to submit a manuscript for publication. Once a subset of journals has satisfied these criteria, financial and access issues compete with the OACE in making a final decision.

Mazda Farshad, Claudia Sidler, Christian Gerber (2013)
Association of scientific and nonscientific factors to citation rates of articles of renowned orthopedic journals
European Orthopaedics and Traumatology, March 2013
info:doi/10.1007/s12570-013-0174-6
From the Abstract: This investigation studied associations of scientific and nonscientific criteria with the citation frequency of articles in two top-ranked international orthopedic journals. Methods: The 100 most (mean, 88 citations/5 years for cases) and 100 least (mean, two citations/5 years for controls) cited articles published between 2000 and 2004 in the Journal of Bone and Joint Surgery and the Bone & Joint Journal (formerly known as JBJS (Br)), two of the most distributed general orthopedic journals, were identified. The association of scientific and nonscientific factors with their citation rate was quantified. Results: Randomized controlled trials, as well as multicenter studies with large sample sizes, were significantly more frequent in the high citation rate group. The unadjusted odds of a highly cited article being supported by industry were 2.8 (95% confidence interval 1.5, 5.6; p<0.05) compared with a lowly cited article.

Steffen Bernius, Matthias Hanauske, Berndt Dugall, and Wolfgang Konig (2013)
Exploring the Effects of a Transition to Open Access: Insights from a Simulation Study
Goethe University Frankfurt, Faculty of Economics and Business Administration, 2012. In Journal of the American Society for Information Science and Technology, Vol. 64, No. 4, 701-726, April 2013, online: 21 Feb 2013
DOI: 10.1002/asi.22772
Abstract: The Open Access (OA) movement, which postulates gratis and unrestricted online access to publicly funded research findings, has significantly gained momentum in recent years. The two ways of achieving OA are self-archiving of scientific work by the authors (Green OA) and publishing in OA journals (Gold OA). But there is still no consensus on which model should be supported in particular. The aim of this simulation study is to discover mechanisms and predict developments that may lead to specific outcomes of possible market transformation scenarios. It contributes to theories related to OA by substantiating the argument of a citation advantage of OA articles and by visualizing the mechanisms of a journal system collapsing in the long term due to the continuation of the serials crisis. The practical contribution of this research stems from the integration of all market players: decisions regarding potential financial support of OA models can be aligned with our findings, as well as the decision of a publisher to migrate his journals to Gold OA. Our results indicate that for scholarly communication in general, a transition to Green OA combined with a certain level of subscription-based publishing and a migration of a few top journals is the most beneficial development.

Christopher Hassall (2013)
Going green: self-archiving as a means for dissemination of research output in ecology and evolution
Ideas in Ecology and Evolution, 5 (2), Feb 2013
info:doi/10.4033/iee.v5i2.4555
Abstract: There is a perception that is prevalent within the academic community that access to information is being restricted by the large publishing houses that dominate academic publishing. However, self-archiving policies that are promoted by publishers provide a method by which this restriction can be relaxed. In this paper I outline the motivation behind self-archiving publications in terms of increased impact (citations and downloads of articles), increased access for the developing world, and decreased library costs. I then describe the current state of self-archiving policies in 165 ecology and evolution journals. I demonstrate that the majority (52%) of papers published in 2011 could have been self-archived in a format close to their final form. Journals with higher impacts tend to have more restrictive policies on self-archiving, and publishers vary in the extent to which they impose these restrictions. Finally, I provide a guide to academics on how to take advantage of opportunities for self-archiving using either institutional repositories or freely-available online tools.

Sayed-Amir Marashi, Seyed Mohammad Amin Hosseini-Nami, Khadijeh Alishah, Mahdieh Hadi, Ali Karimi, Saeedeh Hosseinian, Rouhallah RamezaniFard, Reihaneh Sadat Mirhassani, Zhaleh Hosseini, Zahra Shojaie (2013)
Impact of Wikipedia on Citation Trends
EXCLI Journal (Experimental and Clinical Sciences International online journal for advances in science), Vol. 12, 15-19, January 15, 2013, Supplementary Information (dataset, methodology)
Guest editorial. Abstract: It has been suggested that the "visibility" of an article influences its citation count. More specifically, it is believed that the social media can influence article citations. Here we tested the hypothesis that inclusion of scholarly references in Wikipedia affects the citation trends. To perform this analysis, we introduced a citation "propensity" measure, which is inspired by the concept of amino acid propensity for protein secondary structures. We show that although citation counts generally increase over time, the citation "propensity" does not increase after inclusion of a reference in Wikipedia.

Christian Gumpenberger, Maria-Antonia Ovalle-Perandones, and Juan Gorraiz (2012)
On the impact of Gold Open Access journals
u:scholar, Universität Wien, November 2012. In Scientometrics
info:doi/10.1007/s11192-012-0902-7
Abstract: Gold Open Access (=Open Access publishing) is for many the preferred route to achieve unrestricted and immediate access to research output. However, true Gold Open Access journals are still outnumbered by traditional journals. Moreover, availability of Gold OA journals differs from discipline to discipline and often leaves scientists concerned about the impact of these existent titles. This study identified the current set of Gold Open Access journals featuring a Journal Impact Factor (JIF) by means of Ulrichsweb, the Directory of Open Access Journals and Journal Citation Reports (JCR). The results were analyzed regarding disciplines, countries, quartiles of the JIF distribution in JCR and publishers. Furthermore, the temporal impact evolution was studied for a Top 50 titles list (according to JIF) by means of Journal Impact Factor, SJR and SNIP in the time interval 2000-2010. The identified top Gold Open Access journals proved to be well-established and their impact is generally increasing for all the analyzed indicators. The majority of JCR-indexed OA journals can be assigned to Life Sciences and Medicine. The success rate for JCR inclusion differs from country to country and is often inversely proportional to the number of national OA journal titles. Compiling a list of JCR-indexed OA journals is a cumbersome task that can only be achieved with non-Thomson Reuters data sources. A corresponding automated feature to produce current lists on the fly would be desirable in JCR in order to conveniently track the impact evolution of Gold OA journals.

Ling-Ling Wu, Mu-Hsuan Huang, and Ching-Yi Chen (2012)
Citation patterns of the pre-web and web-prevalent environments: The moderating effects of domain knowledge
Journal of the American Society for Information Science and Technology, 63 (11), 2182-94, November 2012
info:doi/10.1002/asi.22710
Abstract: The Internet has substantially increased the online accessibility of scholarly publications and allowed researchers to access relevant information efficiently across different journals and databases. Because of online accessibility, academic researchers tend to read more, and reading has become more superficial, such that information overload has become an important issue. Given this circumstance, how the Internet affects knowledge transfer, or, more specifically, the citation behavior of researchers, has become a recent focus of interest. This study assesses the effects of the Internet on citation patterns in terms of 4 characteristics of cited documents: topic relevance, author status, journal prestige, and age of references. This work hypothesizes that academic scholars cite more topically relevant articles, more articles written by lower status authors, articles published in less prestigious journals, and older articles with online accessibility. The current study also hypothesizes that researcher knowledge level moderates such Internet effects. We chose the IT and Group subject area and collected 241 documents published in the pre-web period (1991-1995) and 867 documents published in the web-prevalent period (2006-2010) in the Web of Science database. The references of these documents were analyzed to test the proposed hypotheses, which are significantly supported by the empirical results.

V. Calcagno, E. Demoinet, K. Gollner, L. Guidi, D. Ruths, C. de Mazancourt (2012)
Flows of Research Manuscripts Among Scientific Journals Reveal Hidden Submission Patterns
Science, 11 October 2012
info:doi/10.1126/science.1227833
Abstract: The study of science-making is a growing discipline that builds largely on online publication and citation databases, while prepublication processes remain hidden. Here, we report results from a large-scale survey of the submission process, covering 923 scientific journals from the biological sciences in years 2006-2008. Manuscript flows among journals revealed a modular submission network, with high-impact journals preferentially attracting submissions. However, about 75% of published articles were submitted first to the journal that would publish them, and high-impact journals published proportionally more articles that had been resubmitted from another journal. Submission history affected postpublication impact: resubmissions from other journals received significantly more citations than first-intent submissions, and resubmissions between different journal communities received significantly fewer citations.
See also
Philip Ball, Rejection improves eventual impact of manuscripts, Nature News, 11 Oct 2012: Just had your paper rejected? Don't worry - that might boost its ultimate citation tally. An excavation of scientific papers' usually hidden prepublication trajectories from journal to journal has found that papers published after having first been rejected elsewhere receive significantly more citations on average than ones accepted on first submission.
Ruth Williams, The Benefits of Rejection, The Scientist, 11 Oct 2012: A survey of the prepublication histories of papers reveals that manuscripts that are rejected then resubmitted are cited more often.
From Vincent Calcagno's research:
More about submission flows, October 19, 2012: Mail contact to obtain the raw data file
The benefits of rejection, continued, October 23, 2012: On Figure 4A, "One result that attracts considerable attention in our article on Submission Flows"

Mikael Laakso and Bo-Christer Björk (2012)
Delayed Open Access - an overlooked high-impact category of openly available scientific literature
HARIS (Hanken Research Information System), 10 October 2012. Journal of the American Society for Information Science and Technology, published online 23 May 2013, http://onlinelibrary.wiley.com/doi/10.1002/asi.22856/abstract
DOI: 10.1002/asi.22856
Abstract: Delayed open access (OA) refers to scholarly articles in subscription journals made available openly on the web directly through the publisher at the expiry of a set embargo period. Though a substantial number of journals have practiced delayed OA since they started publishing e-versions, empirical studies concerning open access have often overlooked this body of literature. This study provides comprehensive quantitative measurements by identifying delayed OA journals and collecting data concerning their publication volumes, embargo lengths, and citation rates. Altogether 492 journals were identified, publishing a combined total of 111,312 articles in 2011. 77.8% of these articles were made open access within 12 months from publication, with 85.4% becoming available within 24 months. A journal impact factor analysis revealed that delayed OA journals have on average twice as high average citation rates compared to closed subscription journals, and three times as high as immediate OA journals. Overall, the results demonstrate that delayed OA journals constitute an important segment of the openly available scholarly journal literature, both by their sheer article volume as well as by including a substantial proportion of high impact journals.

Philip Davis (2012)
The Effect of Public Deposit of Scientific Articles on Readership
The Physiologist, 55 (5), 161-5, October 2012
Note: this link will download the whole journal issue, not just the cited paper. Abstract: A longitudinal cohort analysis of 3,499 articles published in 12 physiology journals reveals a 14% reduction in full text article downloads when they are made publicly available from the PubMed Central archive. The loss of article readership from the journal website may weaken the ability of the publisher to build communities of interest around the research article, impede the communication of news and events with society members and reduce the perceived value of the journal to institutional subscribers.
See also
Philip Davis, Is PubMed Central Complementing or Competing with Journal Publishers? Scholarly Kitchen, September 20, 2012

G Mahesh (2012)
Open access and impact factors
Current Science, 103 (6), 610, 25 September 2012
Correspondence. Extract: The case in point is CSIR-NISCAIR journals. The institute publishes 17 primary journals, and on the First International Open Access Day on 14 October 2008, NISCAIR made two of its journals open access; by mid-2009 all its journals were available in this mode. Going by the recently released Journal Citation Reports, for the first time two CSIR-NISCAIR journals have crossed IF 1, and as shown in Figure 1, almost all journals have increased their impact factors in 2011 over the previous years. It appears that the increased 2011 IFs are a result of the journals having gone open access from 2008 to 2009 onwards.

Filippo Radicchi (2012)
In science "there is no bad publicity": Papers criticized in technical comments have high scientific impact
arXiv.org > physics > arXiv:1209.4997, 22 September 2012
From the abstract: Technical comments are special types of scientific publications whose aim is to correct or criticize previously published papers. Often, comments are negatively perceived by the authors of the criticized articles because they are believed to make the commented papers less worthy or trustworthy in the eyes of the scientific community. Thus, there is a tendency to think that criticized papers are predestined to have low scientific impact. We show here that such belief is not supported by empirical evidence. We consider thirteen major publication outlets in science and perform a large-scale analysis of the citation patterns of criticized publications. We find that commented papers not only have average citation rates much higher than those of non-commented articles, but also unexpectedly over-populate the set of the most cited publications within a journal. Since comments are published soon after criticized papers, comments can be viewed as early indicators of the future impact of criticized papers.

Sara Pérez Álvarez, Felipe P. Álvarez Arrieta, and Isidro F. Aguillo (2012)
EU FP7 research in Open Access Repositories
Digital.CSIC, the Institutional Repository of the Spanish National Research Council (CSIC), 21 Aug 2012. In 17th International Conference on Science and Technology Indicators (STI), 5-8 September 2012, Montreal, http://sticonference.org/Proceedings/vol1/Alvarez_EU_58.pdf
Abstract: Open access repositories are a reliable source of academic items that can be used for testing the capabilities of webometric analysis. This paper deals with actions needed for extracting web indicators from bibliographic records in open access repositories, provides guidelines to support a further webometric study, and presents the results of a preliminary web impact evaluation carried out over a sample of 1386 EU FP7 output papers available from the OpenAIRE database. The European Commission project OpenAIRE aims, among other objectives, to provide impact measures to assess the research performance from repositories' contents and, especially, of Special Clause 39 project participants within EU FP7. Using URL citations, title mentions and copies of titles as main web impact indicators, this study suggests that a priori the implementation of the mandatory clause SC39 to encourage open access to European research may indeed have resulted in a greater and more immediate web visibility of these papers.

Melissa Terras (2012)
The Impact of Social Media on the Dissemination of Research: Results of an Experiment
Journal of Digital Humanities, 1 (3), September 2012
Three collected blog posts, revised with a new introduction for this journal presentation: The first, "What Happens When You Tweet an Open-Access Paper", discusses the correlation between talking about an individual paper online and seeing its downloads increase. The second, "Is Blogging and Tweeting About Research Papers Worth It? The Verdict", discusses the overall effect of this process on all my papers, highlighting what I think the benefits of open access are. In the final post, "When Was the Last Time You Asked How Your Published Research Was Doing?", I talk about the link between publishers and open access, and how little we know about how often our research is accessed once it is published.

Xianwen Wang, Zhi Wang, and Shenmeng Xu (2012)
Tracing scientists' research trends realtimely
arXiv.org > cs > arXiv:1208.1349, 07 Aug 2012. In Scientometrics, 95 (2): 717-729, 5 May 2013. doi: 10.1007/s11192-012-0884
From the Abstract: In this research, we propose a method to trace scientists' research trends realtimely. By monitoring the downloads of scientific articles in the journal of Scientometrics for 744 hours, namely one month, we investigate the download statistics. Then we aggregate the keywords in these downloaded research papers, and analyze the trends of article downloading and keyword downloading.

Maged Boulos and Patricia Anderson (2012)
Preliminary survey of leading general medicine journals' use of Facebook and Twitter
Journal of the Canadian Health Libraries Association, 33 (02), 38-47, 01 Aug 2012
info:doi/10.5596/c2012-010
From the Abstract: Methods: We selected the top 25 general medicine journals on the Thomson Reuters Journal Citation Report (JCR) list. We surveyed their Facebook and Twitter presences and scanned their Web sites for any Facebook and (or) Twitter features as of November 2011. Results/Discussion: 20 of 25 journals had some sort of Facebook presence, with 11 also having a Twitter presence. Total Likes across all of the Facebook pages for journals with a Facebook presence were 321,997, of which 259,902 came from the New England Journal of Medicine (NEJM) alone. The total numbers of Twitter Followers were smaller by comparison when compiled across all surveyed journals. Likes and Followers are not the equivalents of total accesses but provide some proxy measure for impact and popularity. Those journals in our sample making best use of the open sharing nature of social media are closed-access, with the leading open access journals on the list lagging behind by comparison.
See also this similar finding: Sandra L De Groote, Promoting health sciences journal content with Web 2.0: A snapshot in time, First Monday, Vol 17, No 8, 6 August 2012: "Traditional journals were more likely to use Web 2.0 technology than open access journals."

Bo-Christer Björk and David Solomon (2012)
Open access versus subscription journals: a comparison of scientific impact
BMC Medicine, 10 (1), 17 Jul 2012
info:doi/10.1186/1741-7015-10-73
From the Abstract: In the past few years there has been an ongoing debate as to whether the proliferation of open access (OA) publishing would damage the peer review system and put the quality of scientific journal publishing at risk. Our aim was to inform this debate by comparing the scientific impact of OA journals with subscription journals, controlling for journal age, the country of the publisher, discipline and (for OA publishers) their business model. A total of 610 OA journals were compared with 7,609 subscription journals using Web of Science citation data, while an overlapping set of 1,327 OA journals were compared with 11,124 subscription journals using Scopus data. Overall, average citation rates, both unweighted and weighted for the number of articles per journal, were about 30% higher for subscription journals. However, after controlling for discipline (medicine and health versus other), age of the journal (three time periods) and the location of the publisher (four largest publishing countries versus other countries) the differences largely disappeared in most subcategories except for journals that had been launched prior to 1996. OA journals that fund publishing with article processing charges (APCs) are on average cited more than other OA journals. In medicine and health, OA journals founded in the last 10 years are receiving about as many citations as subscription journals launched during the same period.
See also
Open access means business: Pay-to-publish approaching same impact factor as subscription journals, Science Codex, 17 Jul 2012
David Solomon and Bo-Christer Björk, Opinion: OA Coming of Age, The Scientist, 06 Aug 2012

Bertil Dorch (2012)
On the Citation Advantage of linking to data
hprints.org, Nordic Arts and Humanities and Social Sciences e-printrepository, 05 Jul 2012
Abstract: This paper presents some indications of the existence of a Citation Advantage related to linked data, using astrophysics as a case. Using simple measures, I find that the Citation Advantage presently (at least since 2009) amounts to papers with links to data receiving on average 50% more citations per paper per year than papers without links to data. A similar study by other authors showed a cumulative effect after several years amounting to 20%. Hence, a Data Sharing Citation Advantage seems inevitable.

Adam Eyre-Walker (2012)
Can we assess the quality and impact of science? (video)
YouTube, 02 July 2012
In International workshop on Evolution in the Time of Genomics - part 08, May 2012, Venice. Quoted extracts: "(F1000) assessors are highly influenced by the impact factor of the journal and less by the actual intrinsic quality of the paper"; "Without the information of what journal that paper is published in you have very little power to tell the ultimate impact of that paper"; "People tend to over-rate the quality of science in high-impact factor journals and that is a very important influence"; "The difference between the really high impact journals and the medium quality journals is nothing like as dramatic as we might think". There is a brief mention of open access impact only in questions following the presentation - assumes open access journals.

Sharon Mathelus, Ginny Pittman, and Jill Yablonski-Crepeau (2012)
Promotion of research articles to the lay press: a summary of a three-year project
Learned Publishing, Vol 25, No 3, July 2012, 207-212
Abstract: The promotion of scholarly journal articles to journalists and bloggers via the dissemination of press releases generates a positive impact on the number of citations that publicized journal articles receive. Research by John Wiley & Sons, Inc. shows that article-level publicity efforts and media coverage boost downloads by an average of 1.8 times and were found to increase citations by as much as 2.0-2.2 times in the articles analyzed in this study. We evaluated scholarly journal articles published in nearly 100 Wiley journals, which were also covered in 296 press releases. The results in this case study suggest a need for greater investment in media support for scholarly journals publishing research that sparks interest in a broad news audience, as it could increase citations.

M. Riera and E. Aibar (2012)
Does open access publishing increase the impact of scientific articles? An empirical study in the field of intensive care medicine
Medicina intensiva / Sociedad Espanola de Medicina Intensiva y Unidades Coronarias, 07 June 2012. Via NCBI PubMed.gov
info:pmid/22683044 | info:doi/10.1016/j.medin.2012.04.002
METHODS: We evaluated a total of 161 articles (76% being non-open access articles) published in Intensive Care Medicine in the year 2008. Citation data were compared between the two groups up until April 30, 2011. Potentially confounding variables for citation counts were adjusted for in a linear multiple regression model. RESULTS: The median number (interquartile range) of citations of non-open access articles was 8 (4-12) versus 9 (6-18) in the case of open access articles (p=0.084). In the highest citation range (>8), the citation count was 13 (10-16) and 18 (13-21) (p=0.008), respectively. The mean follow-up was 37.5±3 months in both groups. In the 30-35 months after publication, the average number (mean±standard deviation) of citations per article per month of non-open access articles was 0.28±0.6 versus 0.38±0.7 in the case of open access articles (p=0.043). Independent factors for citation advantage were the Hirsch index of the first signing author (β=0.207; p=0.015) and open access status (β=3.618; p=0.006). CONCLUSIONS: Open access publishing and the Hirsch index of the first signing author increase the impact of scientific articles. The open access advantage is greater for the more highly cited articles, and appears in the 30-35 months after publication.

Judit Bar-Ilan, Stefanie Haustein, Isabella Peters, Jason Priem, Hadas Shema, Jens Terliesner (2012)
Beyond citations: Scholars' visibility on the social Web
arXiv.org > cs > arXiv:1205.5611, 25 May 2012. In 17th International Conference on Science and Technology Indicators, Montreal, 5-8 Sept. 2012, http://sticonference.org/Proceedings/vol1/Bar-Ilan_Beyond_98.pdf
Abstract: Traditionally, scholarly impact and visibility have been measured by counting publications and citations in the scholarly literature. However, increasingly scholars are also visible on the Web, establishing presences in a growing variety of social ecosystems. But how wide and established is this presence, and how do measures of social Web impact relate to their more traditional counterparts? To answer this, we sampled 57 presenters from the 2010 Leiden STI Conference, gathering publication and citation counts as well as data from the presenters' Web "footprints." We found Web presence widespread and diverse: 84% of scholars had homepages, 70% were on LinkedIn, 23% had public Google Scholar profiles, and 16% were on Twitter. For sampled scholars' publications, social reference manager bookmarks were compared to Scopus and Web of Science citations; we found that Mendeley covers more than 80% of sampled articles, and that Mendeley bookmarks are significantly correlated (r=.45) with Scopus citation counts.

George Lozano, Vincent Lariviere, and Yves Gingras (2012)
The weakening relationship between the Impact Factor and papers' citations in the digital age
arXiv.org > cs > arXiv:1205.4328, 19 May 2012. In Journal of the American Society for Information Science and Technology, Vol 63, No 11, 2140-2145, November 2012 http://onlinelibrary.wiley.com/doi/10.1002/asi.22731/abstract
From the Abstract: We compare the strength of the relationship between journals' Impact Factors and the actual citations received by their respective papers from 1902 to 2009. Throughout most of the 20th century, papers' citation rates were increasingly linked to their respective journals' Impact Factors. However, since 1990, the advent of the digital age, the strength of the relation between Impact Factors and paper citations has been decreasing. This decrease began sooner in physics, a field that was quicker to make the transition into the electronic domain. Furthermore, since 1990, the proportion of highly cited papers coming from highly cited journals has been decreasing, and accordingly, the proportion of highly cited papers not coming from highly cited journals has also been increasing. Should this pattern continue, it might bring an end to the use of the Impact Factor as a way to evaluate the quality of journals, papers and researchers.
See also
George Lozano, The demise of the Impact Factor: The strength of the relationship between citation rates and IF is down to levels last seen 40 years ago, Impact of Social Sciences, 08 June 2012: Lozano discusses the recent paper that he co-authored.
Robinson Meyer, Thanks to the Web, Even Scientists Are Reading for the Articles, The Atlantic, July 9, 2012
Study reveals declining influence of high impact factor journals, Université de Montréal News, 07 Nov 2012
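The trend Lozano et al. report is, at bottom, a year-by-year correlation between journals' Impact Factors and the citation counts of their individual papers. As an illustrative sketch only (the figures below are invented, not taken from the paper), one such yearly correlation could be computed like this:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical sample for one publication year: each paper's journal
# Impact Factor paired with that paper's citation count.
impact_factors = [2.1, 2.1, 5.8, 5.8, 9.3, 9.3, 31.0, 31.0]
citations = [4, 15, 9, 40, 12, 55, 20, 90]

r = pearson_r(impact_factors, citations)
# Repeating this for successive publication years and watching r drift
# toward 0 is the kind of weakening relationship the authors describe.
```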

Henk Moed (2012)
Does open access publishing increase citation or download rates?
Research Trends, No. 28, May 2012
The effect of "Open Access" (OA) on the visibility or impact of scientific publications is one of the most important issues in the fields of bibliometrics and information science. During the past 10 years numerous empirical studies have been published that examine this issue using various methodologies and viewpoints. Comprehensive reviews and bibliographies are given, amongst others, by OPCIT, Davis and Walters, and Craig et al. The aim of this article is not to replicate nor update these thorough reviews. Rather, it aims to present the two main methodologies that were applied in these OA-related studies and discusses their potentialities and limitations. The first method is based on citation analyses; the second on usage analyses.

Brian Kelly and Jenny Delasalle (2012)
Can LinkedIn and Academia.edu Enhance Access to Open Repositories?
Opus: Online Publications Store, University of Bath, May 2012. In OR2012: 7th International Conference on Open Repositories, 9-13 July 2012, Edinburgh
From the Abstract: we are witnessing the increasing take-up of a range of third-party services such as LinkedIn and Academia.edu which are being used by researchers to publish information related to their professional activities, including details of their research publications. The paper provides evidence which suggests that personal use of such services can increase the number of downloads by increasing SEO (Search Engine Optimisation) rankings through inbound links from highly ranked web sites. A survey of use of such services across Russell Group universities shows the popularity of a number of social media services. In the light of existing usage of these services this paper proposes that institutional encouragement of their use by researchers may generate increased accesses to institutional research publications at little cost to the institution. This paper concludes by describing further work which is planned in order to investigate the SEO characteristics of institutional repositories.

Carlos Paiva, Joao da Silveira, and Bianca Ribeiro (2012)
Articles with short titles describing the results are cited more often
Clinics, 67(5), 509-13, May 2012. Via PubMed Central
info:doi/10.6061/clinics/2012(05)17
From the Abstract: OBJECTIVE: The aim of this study was to evaluate some features of article titles from open access journals and to assess the possible impact of these titles on predicting the number of article views and citations. RESULTS: Short-titled articles had higher viewing and citation rates than those with longer titles. Titles containing a question mark, containing a reference to a specific geographical region, and that used a colon or a hyphen were associated with a lower number of citations. Articles with results-describing titles were cited more often than those with methods-describing titles. After multivariate analysis, only a low number of characters and title typology remained as predictors of the number of citations.

Tom Rees, Katherine Ayling-Rouse, and Sheelah Smith (2012)
Accesses versus citations: Why you need to measure both to assess publication impact
8th Annual Meeting of ISMPP (International Society for Medical Publication Professionals), Baltimore, MD, April 23-25, 2012
Poster paper. Abstract: OBJECTIVE: Article accesses and citations provide 2 metrics to assess article impact. However, the relationship between the 2 is not constant or well understood. We investigated the relationship between article accesses and citations in 3 general medicine journals with different journal rankings. RESEARCH DESIGN AND METHODS: We collected the numbers of article accesses and citations from a representative selection of original research articles published in 2009 and 2010 in 3 peer-reviewed, international, online-only, open-access journals: PLoS Medicine, BMC Medicine, and the International Journal of General Medicine (IJGM) (SCImago journal ranking 1.04, 0.49, and 0.06, respectively). RESULTS: The sample included 104 articles (2 outliers were excluded). CONCLUSION: The relationship between article accesses and citations varies, with the highest ratio of citations to accesses for journals with the highest journal ranking. For open-access journals with a low impact factor, overall article reach may be higher than expected on the basis of citations.

Patrick Vandewalle (2012)
Code Sharing is Associated with Research Impact in Image Processing
Reproducible Research Repository, EPFL, Lausanne, 23 Apr 2012. IEEE Computing in Science and Engineering, Vol 14, No 4, 42-47, July-Aug 2012
http://dx.doi.org/10.1109/MCSE.2012.63
Abstract: In computational sciences such as image processing, the publication itself is often not enough to allow other researchers to verify the results by repeating the described experiments. In many cases, supplementary material such as source code and measurement data are required, or can at least be very helpful. Still, only approximately 10% of recently published papers in image processing have code available online. One of the arguments for not making code available is the extra time required to prepare the material. In this paper, we claim that this additional time may be well spent, as the availability of code for a publication is associated with an increase in the expected number of citations. We show this with exploratory analyses of the relationship between code availability and the number of citations for image processing papers.
Note on open access citation impact on results (p4): As can be seen in the open access citation studies (such as the one by Lawrence), papers for which an online version is freely available have an increased number of citations. I did not take this into account in my analyses by adding the open access availability as another variable. Articles that have code available generally also have an online version of the article. The citation effect seen above is therefore the combined effect of the open access availability of the paper and the availability of code.

Henk Moed (2012)
The Effect of Open Access upon Citation Impact
Editors' Update, Elsevier.com, 22 Mar 2012
Does Open Access publishing increase citation rates? From a methodological point of view, the debate focuses on biases, control groups, sampling, and the degree to which conclusions from case studies can be generalized. This note does not give a complete overview of studies that were published during the past decade but highlights key events. An extended version of this paper will be published.

Jason Priem, Heather Piwowar, and Bradley Hemminger (2012)
Altmetrics in the wild: Using social media to explore scholarly impact
arXiv.org > cs > arXiv:1203.4745, 20 Mar 2012
From the Abstract: In growing numbers, scholars are integrating social media tools like blogs, Twitter, and Mendeley into their professional communications. The online, public nature of these tools exposes and reifies scholarly processes once hidden and ephemeral. Metrics based on these activities could inform broader, faster measures of impact, complementing traditional citation metrics. This study explores the properties of these social media-based metrics or "altmetrics", sampling 24,331 articles published by the Public Library of Science.
See also
Heather Piwowar, Altmetrics shows that citations can't stand up to the full 31 flavours of research impact, Impact of Social Sciences, 04 Apr 2012

Xin Shuai, Alberto Pepe, and Johan Bollen (2012)
How the Scientific Community Reacts to Newly Submitted Preprints: Article Downloads, Twitter Mentions, and Citations
arXiv.org > cs > arXiv:1202.2461, 11 Feb 2012. PLoS ONE, 7(11): e47523, November 1, 2012 http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0047523
doi:10.1371/journal.pone.0047523
Abstract: We analyze the online response of the scientific community to the preprint publication of scholarly articles. We employ a cohort of 4,606 scientific articles submitted to the preprint database arXiv.org between October 2010 and April 2011. We study three forms of reactions to these preprints: how they are downloaded on the arXiv.org site, how they are mentioned on the social media site Twitter, and how they are cited in the scholarly record. We perform two analyses. First, we analyze the delay and time span of article downloads and Twitter mentions following submission, to understand the temporal configuration of these reactions and whether significant differences exist between them. Second, we run correlation tests to investigate the relationship between Twitter mentions and both article downloads and article citations. We find that Twitter mentions follow rapidly after article submission and that they are correlated with later article downloads and later article citations, indicating that social media may be an important factor in determining the scientific impact of an article.

Heekyung Kim (2012)
The Effect of Free Access on the Diffusion of Scholarly Ideas
MIS Speaker's Series, University of Arizona, 24 January 2012
From the Abstract: By using a dataset from the Social Science Research Network (SSRN), an open repository of research articles, and employing a natural experiment that allows the estimation of the value of free access separate from confounding factors such as early viewership and quality differential, this study identifies the causal effect of free access on citation counts. The natural experiment in this study is that a select group of published articles is posted on SSRN at a time chosen by their authors' affiliated organizations or SSRN, not by their authors. Using a difference-in-difference method and comparing the citation profiles of the articles before and after the posting time on SSRN against a group of control articles with similar characteristics, I estimated the effect of the SSRN posting on citation counts. The articles posted on SSRN receive more citations even prior to being posted on SSRN, suggesting that they are of higher quality. Their citation counts further increase after being posted, gaining an additional 10-20% of citations. This gain is likely to be caused by the free access that SSRN provides.
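The difference-in-difference logic Kim describes compares the citation change of SSRN-posted articles with the change for matched controls over the same window; the estimator itself reduces to one subtraction of subtractions. A toy sketch, with invented numbers purely for illustration:

```python
def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """Difference-in-differences: the treated group's change over time
    minus the control group's change over the same period."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical mean citations per article before/after the posting date.
effect = did_estimate(treated_pre=10.0, treated_post=14.0,
                      control_pre=9.0, control_post=11.0)
# Treated articles gained 4 citations and controls gained 2 over the same
# window, so the posting itself is credited with the remaining 2.
```

The subtraction of the control group's trend is what separates the free-access effect from the pre-existing quality difference the abstract notes.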

Jingfeng Xia and Ying Liu (2012)
Usage Patterns of Open Genomic Data
College & Research Libraries, 09 January 2012
Pre-print. Abstract: This paper uses the Gene Expression Omnibus (GEO), a data repository in biomedical sciences, to examine the usage patterns of open data repositories. It attempts to identify the degree of recognition of data reuse value and understand how e-science has impacted large-scale scholarship. By analyzing a list of 1,211 publications that cite GEO data to support their independent studies, it discovers that free data can support a wealth of high-quality investigations, that the rate of open data use keeps growing over the years, and that scholars in different countries show different rates of complying with data sharing policies.

Jingfeng Xia and Katie Nakanishi (2012)
Self-Selection and the Citation Advantage of Open Access Articles
Online Information Review, 36 (1), 2012
(Subscription access required.) From the Abstract: This research examines the relationship between the open access availability of journal papers in anthropology and their citation conditions. We apply a statistical logistic regression model to explore this relationship, and compare two groups of papers, those published in high-ranked journals and those in low-ranked journals (based on journal impact factor), to examine the likelihood that open access status is correlated to scholarly impact. The results reveal that open access papers in general receive more citations. Moreover, this research finds that 1) papers in high-ranked journals do not have a higher open access rate, and 2) papers in lower-ranked journals have a greater rate of citations if they are freely accessible. The findings are contrary to the existing theory that the higher citation rate of open access papers is caused by authors posting their best papers online.

Patricia Shields, Nandhini Rangarajan, and Lewis Stewart (2012)
Open Access Digital Repository: Sharing Student Research with the World
Journal of Public Affairs Education, 18 (1), 157-81, 2012
Note: this link will download the pdf of the full Winter 2012 journal issue - go to page 157. From the Abstract: We study the impact of content factors and search engine optimization factors on download rates of capstone papers. We examined all 290 MPA capstone papers at Texas State University which have been made available through an online digital repository for public consumption. Results show strong support for the impact of search engine factors on download rates. The implications of high download rates of MPA capstone papers on public administration research, practice, and education are discussed in this paper.

Gunther Eysenbach (2011)
Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact
Journal of Medical Internet Research, 13 (4), 16 December 2011
info:doi/10.2196/jmir.2012
From the Abstract: Between July 2008 and November 2011, all tweets containing links to articles in the Journal of Medical Internet Research (JMIR) were mined. A total of 4208 tweets cited 286 distinct JMIR articles. Highly tweeted articles were 11 times more likely to be highly cited than less-tweeted articles (9/12 or 75% of highly tweeted articles were highly cited, while only 3/43 or 7% of less-tweeted articles were highly cited; rate ratio 0.75/0.07 = 10.75, 95% confidence interval 3.4-33.6). Top-cited articles can be predicted from top-tweeted articles with 93% specificity and 75% sensitivity. Conclusions: Tweets can predict highly cited articles within the first 3 days of article publication. Social media activity either increases citations or reflects the underlying qualities of the article that also predict citations, but the true use of these metrics is to measure the distinct concept of social impact. Social impact measures based on tweets are proposed to complement traditional citation metrics. The proposed twimpact factor may be a useful and timely metric to measure uptake of research findings and to filter research findings resonating with the public in real time.
See also
Gunther Eysenbach, Correction: Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact, Journal of Medical Internet Research, 14 (1), 04 January 2012
info:doi/10.2196/jmir.2041
A minor error in the references section in the originally published version of the editorial by Eysenbach (J Med Internet Res 2011;13[4]:e123) on the relationship between citations and tweetations has been corrected; in addition, references that are part of the dataset are no longer cited as references. The now corrected problem with the references was a formatting/presentation problem only and had no impact on the study findings.
Haydn Shaughnessy, How Could Twitter Influence Science (And Why Scientists Are on Board), Forbes, 15 Jan 2012

Claire Bower, Twimpact factors: can tweets really predict citations? BMJ Web Development Blog, 6 Jan 2012
"A new paper is kicking up a storm in the world of altmetrics ... the very fact that it has been published and discussed so widely is surely a testament to the increasing importance of social metrics in evaluating article impact."
Alexis Madrigal, 'Highly Tweeted Articles Were 11 Times More Likely to Be Highly Cited', The Atlantic, Jan 13, 2012: "if anything like this kind of correlation is found in other fields, a hidden value of Twitter's network will be revealed. Want to peer a year or two into the future of a scientific field?"
Laura Costello, Research Digest: Can Tweets Predict Citations? EdLab, 02/09/2012: "The article also has fascinating implications for the marketing of academic publications."
Martin Fenner, Do more tweets mean higher citations? If so, Twitter can lead us to the personalised journal; pinpointing more research that is relevant to your interests, Impact of Social Sciences, 09 February 2012: "We need more research like the Eysenbach paper about what it means if someone is linking to a scholarly paper via social media."
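Eysenbach's headline figures follow directly from the 2x2 counts quoted in the abstract above (9 of 12 highly tweeted articles were highly cited; 3 of 43 less-tweeted articles were). Re-deriving the rate ratio, sensitivity, and specificity from those counts:

```python
# 2x2 counts from the abstract: tweet level vs. citation level.
highly_tweeted_cited, highly_tweeted_total = 9, 12
less_tweeted_cited, less_tweeted_total = 3, 43

# Rate ratio: 0.75 / (3/43) = 10.75, the "11 times more likely" figure.
rate_ratio = (highly_tweeted_cited / highly_tweeted_total) / (
    less_tweeted_cited / less_tweeted_total)

# Treat "highly tweeted" as a predictor of "highly cited":
tp = highly_tweeted_cited                       # highly cited, flagged
fn = less_tweeted_cited                         # highly cited, missed
tn = less_tweeted_total - less_tweeted_cited    # not highly cited, not flagged
fp = highly_tweeted_total - highly_tweeted_cited

sensitivity = tp / (tp + fn)   # 9/12 = 75%
specificity = tn / (tn + fp)   # 40/43, about 93%
```

The 93%/75% figures in the abstract are exactly these two ratios, so the prediction claim is a direct reading of the 2x2 table rather than a separate model.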

Hyeon-Eui Kim, Xiaoqian Jiang, Jihoon Kim, Lucila Ohno-Machado (2011)
Trends in biomedical informatics: most cited topics from recent years
Journal of the American Medical Informatics Association, 18 (Suppl 1), 01 December 2011
info:pmid/22180873 | info:doi/10.1136/amiajnl-2011-000706
Abstract: Biomedical informatics is a young, highly interdisciplinary field that is evolving quickly. It is important to know which published topics in generalist biomedical informatics journals elicit the most interest from the scientific community, and whether this interest changes over time, so that journals can better serve their readers. It is also important to understand whether free access to biomedical informatics articles impacts their citation rates in a significant way, so authors can make informed decisions about unlock fees, and journal owners and publishers understand the implications of open access. The topics and JAMIA articles from years 2009 and 2010 that have been most cited according to the Web of Science are described. To better understand the effects of free access on article dissemination, the number of citations per month after publication for articles published in 2009 versus 2010 was compared, since there was a significant change in free access to JAMIA articles between those years. Results suggest that there is a positive association between free access and citation rate for JAMIA articles.

Sears, J. R. (2011)
Data Sharing Effect on Article Citation Rate in Paleoceanography
American Geophysical Union, Fall Meeting 2011
Abstract: The validation of scientific results requires reproducible methods and data. Often, however, data sets supporting research articles are not openly accessible and interlinked. This analysis tests whether open sharing and linking of supporting data through the PANGAEA data library measurably increases the citation rate of articles published between 1993 and 2010 in the journal Paleoceanography as reported in the Thomson Reuters Web of Science database. The 12.85% (171) of articles with publicly available supporting data sets received 19.94% (8,056) of the aggregate citations (40,409). Publicly available data were thus significantly (p=0.007, 95% confidence interval) associated with about 35% more citations per article than the average of all articles sampled over the 18-year study period (1,331), and the increase is fairly consistent over time (14 of 18 years). This relationship between openly available, curated data and increased citation rate may incentivize researchers to share their data.
See also
Michael Diepenbroek, Data Sharing Effect on Article Citation Rate in Paleoceanography, KomFor blog, November 27, 2011. Includes data plot.
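Sears' headline percentages can be checked directly from the counts in the abstract (171 of 1,331 articles had open data; those articles drew 8,056 of the 40,409 total citations). A quick re-derivation, with the per-article means added; note the "about 35% more" figure depends on the paper's own baseline and is not re-derived here:

```python
articles_with_data, articles_total = 171, 1331
citations_to_data_articles, citations_total = 8056, 40409

# Shares reported in the abstract.
article_share = 100 * articles_with_data / articles_total              # ~12.85%
citation_share = 100 * citations_to_data_articles / citations_total    # ~19.94%

# Mean citations per article, with open data vs. the full sample.
mean_with_data = citations_to_data_articles / articles_with_data       # ~47.1
mean_overall = citations_total / articles_total                        # ~30.4
```

That 12.85% of the articles collect 19.94% of the citations is the core of the claimed data-sharing advantage: the with-data group's per-article mean is well above the overall mean.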

Henneken, E. and Accomazzi, A. (2011)
Linking to Data - Effect on Citation Rates in Astronomy
arXiv.org > cs > arXiv:1111.3618, 15 Nov 2011. In Proceedings of ADASS XXI (Astronomical Data Analysis Software & Systems), Paris, 6-10 November 2011
From the Abstract: Is there a difference in citation rates between articles that were published with links to data and articles that were not? In this presentation we will show this is indeed the case: articles with links to data result in higher citation rates than articles without such links.
See also
Linking to Data - Effect on Citation Rates in Astronomy
Meters, Metrics and More, 03 Jun 2011
Extracts: Using the data holdings of the SAO/NASA Astrophysics Data System, our analysis shows that articles with data links are indeed cited more than articles without these links - for this data set, articles with data links acquired 20% more citations (compared to articles without these links).

Xia, J., Wilhoite, S. K. and Myers, R. L. (2011)
A librarian-LIS faculty divide in open access practice
Journal of Documentation, 67 (5), 791-805 (2011)
info:doi/10.1108/00220411111164673
(Subscription access required) From the Abstract: This paper measures the OA availabilities and citations of scholarly articles from 20 top-ranked LIS journals published in 2006.

Henneberger, S. (2011)
Entwicklung einer Analysemethode für Institutional Repositories unter Verwendung von Nutzungsdaten (Development of an analytical method for Institutional Repositories using usage data)
Thesis, edoc-Server der Humboldt-Universität zu Berlin, 31 Oct 2011
From the English abstract: Download data are the subject of scientific investigations, in which the concept of the Citation Impact is applied to the rate of use of a publication and the so-called Download Impact is formed. Analyzed with nonparametric methods, download data give information about the visibility of electronic publications on the Internet. These methods form the core of NoRA (Non-parametric Repository Analysis). The analytical method NoRA was successfully applied to data from the Institutional Repositories of four universities. In each case, groups of publications were identified that differed significantly in their usage. Similarities in the results reveal factors that influence the usage data, which have not been taken into account previously. The presented results imply further applications of NoRA but also raise doubts about the value of download data of single publications.
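The abstract does not name NoRA's specific nonparametric tests, so as an assumed illustration only: the Mann-Whitney U statistic is a standard rank-based way to ask whether one group of publications tends to be downloaded more than another, without assuming any distribution for the counts.

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for sample a against sample b,
    using midranks to handle tied values."""
    pooled = sorted(a + b)
    ranks = {}  # value -> midrank (1-based)
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # midrank of positions i+1..j
        i = j
    rank_sum_a = sum(ranks[v] for v in a)
    n_a = len(a)
    return rank_sum_a - n_a * (n_a + 1) / 2

# Hypothetical monthly download counts for two groups of repository items.
group1 = [120, 45, 60, 200, 33]
group2 = [10, 25, 18, 40, 22]
u = mann_whitney_u(group1, group2)
# u near len(group1)*len(group2) means group1 is almost uniformly
# downloaded more; u near half that would mean no clear difference.
```

Groups "that differed significantly in their usage", as the abstract puts it, are those where a test statistic like this lands far from its no-difference expectation.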

Tarrant, D. (2011)
A Study of Early Indication Citation Metrics
PhD thesis, ECS EPrints Repository, University of Southampton, 24 Oct2011
From the Abstract: Each new citation establishes a large number of co-citation relationships between that publication and older material whose citation impact is already well established. By taking advantage of this co-citation property, this thesis investigates the possibility of developing a metric that can provide an earlier indicator of a publication's citation impact. This thesis proposes a new family of co-citation based impact measures, describes a system to evaluate their effectiveness against a large citation database, and justifies the results of this evaluation against an analysis of a diverse range of research metrics.

Yuan, S. and Hua, W. (2011)
Scholarly impact measurements of LIS open access journals: based on citations and links
The Electronic Library, 29 (5), 682, 2011
(Subscription access required) From the Abstract: The study selected 97 LIS OA journals as a sample and measured their scholarly impact on the basis of citations and links. The citation counts in WoS, coverage in LISA, Web links, WIFs and Page Rank of the journals are retrieved and calculated, and correlations between citation counts, links, pages, WIFs, and Page Rank are also analyzed. The results indicate that LIS OA journals have become a significant component of the scholarly communication system.

Priem, J., Piwowar, H. and Hemminger, B. (2011)
Altmetrics in the wild: An exploratory study of impact metrics based on social media
Poster at Metrics 2011: Symposium on Informetric and Scientometric Research, New Orleans, LA, 12 October
Extracts: As growing numbers of scholars publicly read, bookmark, share, discuss, and rate using online tools, these invisible impacts are beginning to be seen. Because measurements of these new traces may inform alternatives to traditional citation metrics, they have been dubbed altmetrics. The goal of this study is to better understand the potential of altmetrics. We gathered altmetrics for a large sample of scholarly articles: all 24,334 articles published by the Public Library of Science (PLoS) before December 23, 2010.

Wang, M.-L. (2011)
The impact of open access journals on library and information scientists' research in Taiwan
Universiti Teknologi Mara Digital Repository, 07 Oct 2011. In Asia-Pacific Conference On Library & Information Education & Practice 2011 (A-LIEP 2011), 22-24 June 2011, Malaysia
From the Abstract: the objectives of the study are to explore the scholarly productivity of LIS scholars in Taiwan, to find out what articles they publish and OA articles as a percentage of all titles, and to calculate the mean citation rate of open access articles and articles not freely available online. To determine whether a difference in research impact existed, two research impact indicators were used: open access articles as a percentage of all published titles, and the mean citation rate of open access articles versus those not freely available online. Data on published articles with citation counts by the LIS scholars in Taiwan from 2000 to 2009 was collected from the ACI Database and the Social Science Citation Index Database. The study shows that of the 72 LIS scholars who were subjects of the investigation, 64 had published 745 articles within the previous ten years: 679 articles in Chinese and 66 articles in English; 499 of these were OA articles, and 264 were non-OA articles; OA articles constituted 66.98% of the total number of academic articles. The mean citation rate of OA versus non-OA article citation was 1.29.

Chuanfu Chen, Yuan Yu, Qiong Tang, Kuei Chiu, Yan Rao, Xuan Huang and Kai Sun (2011)
Assessing the authority of free online scholarly information
Scientometrics, 02 Oct 2011
(Subscription access required. Online preview.) From the Abstract: Using a modified version of Jim Kapoun's "Five criteria for evaluating web pages" as a framework, this research selected 32 keywords from eight disciplines, inputted them into three search engines (Google, Yahoo and AltaVista) and used the Analytic Hierarchy Process to determine the weights. The first batches of results (web pages) from keyword searching were selected as evaluation samples (in the two search phases, the first 50 and 10 results were chosen, respectively), and a total of 3,134 samples were evaluated for authority based on the evaluation framework. The results show that the average authority value for free online scholarly information is about 3.63 (out of five), which is in the fair level (3 ≤ Z < 4) (Z is the value assigned to each sample). About 41% of all samples collected provide more authoritative scholarly information. Different domain names, resource types, and disciplines of free online scholarly information perform differently when scored in terms of authority. In conclusion, the authority of free online scholarly information has been unsatisfactory, and needs to be improved.

Davis, P. (2011)
Do discounted journal access programs help researchers in sub-Saharan Africa? A bibliometric analysis
eCommons@Cornell, 23 Sep 2011. In Learned Publishing, Vol. 24, No. 4, October 2011, pp. 287-298
Abstract: Prior research has suggested that providing free and discounted access to the scientific literature to researchers in low-income countries increases article production and citation. Using traditional bibliometric indicators for institutions in sub-Saharan Africa, we analyze whether institutional access to TEEAL (a digital collection of journal articles in agriculture and allied subjects) increases: 1) article production; 2) reference length; and 3) number of citations to journals included in the TEEAL collection. Our analysis is based on nearly 20,000 articles (containing half a million references) published between 1988 and 2009 at 70 institutions in 11 African countries. We report that access to TEEAL does not appear to result in higher article production, although it does lead to longer reference lists (an additional 2.6 references per paper) and a greater frequency of citations to TEEAL journals (an additional 0.4 references per paper), compared to non-subscribing institutions. We discuss how traditional bibliometric indicators may not provide a full picture of the effectiveness of free and discounted literature programs.

Jeffrey Furman and Scott Stern (2011)
Climbing atop the Shoulders of Giants: The Impact of Institutions on Cumulative Research
American Economic Review, 101 (5), 1933-63, August 2011
From the Abstract: This paper assesses the impact of a specific institution, a biological resource center, whose objective is to certify and disseminate knowledge. We disentangle the marginal impact of this institution on cumulative research from the impact of selection, in which the most important discoveries are endogenously linked to research-enhancing institutions. Exploiting exogenous shifts of biomaterials across institutional settings and employing a difference-in-differences approach, we find that effective institutions amplify the cumulative impact of individual scientific discoveries. From the paper: Our empirical analysis focuses on whether articles associated with materials exogenously shifted into a BRC receive a boost in citations after their deposit into the BRC, controlling for article-specific fixed effects and fixed effects for article age and calendar year. Our setting allows us to evaluate both models that include a control sample and models that rely exclusively on variation in the timing and date of the treatment of the deposit of the biomaterial into the BRC. Both approaches provide evidence for the marginal impact of BRCs on subsequent knowledge; the post-deposit citation boost is estimated to be between 57 percent and 135 percent across different specifications. Empirical checks of our key identification assumptions reinforce our overall findings. We find that the marginal impact of BRC deposit is marginally higher for articles published in less prestigious journals and that the citation boost is concentrated in follow-on research articles involving more complex subject matter. (Paper extract from copy at: http://www.econ.tuwien.ac.at/hanappi/Lehre/Economic%20Policy/2012/Furman_2011.pdf)
See also these 2002 and 2006 papers of the same title by the same authors.
News articles on this paper:
Peter Dizikes, How research goes viral, MIT News, 10 January 2012: "When things become more open, it's not simply that you get more research, but you get more diverse research," Stern notes.
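The difference-in-differences logic in the Furman and Stern abstract above can be sketched in a few lines; the citation means below are invented, and the real paper additionally controls for article, age and calendar-year fixed effects:

```python
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """DiD estimate: the treated group's pre-to-post change in mean
    citations, minus the same change for the control group."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treat_post) - mean(treat_pre)) - (mean(ctrl_post) - mean(ctrl_pre))

# Hypothetical mean yearly citations per article, before and after BRC deposit:
boost = diff_in_diff([2, 3], [6, 7], [2, 3], [3, 4])
print(boost)  # → 3.0
```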

Peter Ingwersen and Anita Elleby (2011)
Do Open Access Working Papers Attract more Citations Compared to Printed Journal Articles from the same Research Unit?
Royal School of Information and Library Science, Copenhagen, (2011). In Proceedings of the 13th International Conference of the International Society for Scientometrics & Informetrics, Durban, South Africa, 4-7 July 2011
Abstract: This paper presents the results of an empirical case study of the characteristics of citations received by 10 open access non-peer reviewed working papers published by a prestigious multidisciplinary, but basically social science research institute, compared to 10 printed peer reviewed journal articles published in the same year (2004) by the same institute and predominantly by the same authors. The study analyzes the total amount of citations and citation impact observed in Web of Science (WoS) and Google Scholar (GS) received during the five-year period 2004-09 (February) by the two publication types, the citation distributions over the individual sample publications and observed years, as well as over external, institutional and personal self-citations. The institute concerned is the Danish Institute for International Studies (DIIS), Copenhagen. The results demonstrate that the open access working papers publicly accessible through the DIIS e-archive became far less cited than the corresponding sample of DIIS journal articles published in printed form. However, highly cited working papers have higher impact than the average of the lower half of cited articles. Citation time series show identical distinct patterns for the articles in WoS and GS and working papers in GS, more than doubling the amount of citations received through the latter source.
See also Open access working papers not good enough, ScienceNordic.com, December 7, 2011

McGreal, R. and Chen, N.-S. (2011)
AUPress: A Comparison of an Open Access University Press with Traditional Presses
Educational Technology & Society, 14 (3), 231-9 (2011)
Abstract: This study is a comparison of AUPress with three other traditional (non-open access) Canadian university presses. The analysis is based on the rankings that are correlated with book sales on Amazon.com and Amazon.ca. Statistical methods include the sampling of the sales ranking of randomly selected books from each press. The results of one-way ANOVA analyses show that there is no significant difference in the ranking of printed books sold by AUPress in comparison with traditional university presses. However, AUPress can demonstrate a significantly larger readership for its books, as evidenced by the number of downloads of the open electronic versions.
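The one-way ANOVA applied to the sampled sales rankings reduces to an F statistic comparing between-group to within-group variance; a self-contained sketch (the sample ranks are invented, not data from the study):

```python
def f_oneway(*groups):
    """One-way ANOVA F statistic: between-group mean square
    divided by within-group mean square."""
    k = len(groups)                              # number of groups (presses)
    n = sum(len(g) for g in groups)              # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented Amazon sales ranks sampled from two presses:
print(round(f_oneway([120, 95, 150, 110], [130, 100, 160, 105]), 3))
```

A small F (relative to the F distribution's critical value for the given degrees of freedom) is what "no significant difference" in the abstract refers to.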

Davis, P. and Walters, W. (2011)
The impact of free access to the scientific literature: a review of recent research
Journal of the Medical Library Association, 99 (3), 208-217, July 2011
doi: 10.3163/1536-5050.99.3.008
From the Abstract: The paper reviews recent studies that evaluate the impact of free access (open access) on the behavior of scientists as authors, readers, and citers in developed and developing nations. It also examines the extent to which the biomedical literature is used by the general public. Researchers report that their access to the scientific literature is generally good and improving. For authors, the access status of a journal is not an important consideration when deciding where to publish. There is clear evidence that free access increases the number of article downloads, although its impact on article citations is not clear. Recent studies indicate that large citation advantages are simply artifacts of the failure to adequately control for confounding variables. The effect of free access on the general public's use of the primary medical literature has not been thoroughly evaluated.

Shafi, Sheikh Mohammad and Bhat, Mohammad Haneef (2011)
The Impact of Open Access Contributions: Developed and Developing World Perspectives
Proceedings of the 15th International Conference on Electronic Publishing, Istanbul, Turkey, 22 Jun 2011
From the Abstract: The study explores the research impact of Open Access research articles across the globe with a view to testing the hypothesis that OA research contributions emanating from developing countries receive equal citations (and hence equal research impact) as those from the developed world. The study covers 5,639 research articles from 50 Open Access DOAJ-based Medical Sciences journals covering the period from 2005 to 2006. The research articles from the developed countries receive a higher number of citations (and hence greater research impact) compared to those of the developing world. The study may help pave the way for framing policies and strategies to increase the impact of research in the developing world.

Yan, K.-K. and Gerstein, M. (2011)
The Spread of Scientific Information: Insights from the Web Usage Statistics in PLoS Article-Level Metrics
PLoS ONE, 6 (5), 16 May 2011
info:doi/10.1371/journal.pone.0019917
From the Abstract: In this work, we focus on a community of scientists and study, in particular, how the awareness of a scientific paper is spread. Our work is based on the web usage statistics obtained from the PLoS Article Level Metrics dataset compiled by PLoS. We found that the spread of information displays two distinct decay regimes: a rapid downfall in the first month after publication, and a gradual power law decay afterwards. We identified these two regimes with two distinct driving processes: a short-term behavior driven by the fame of a paper, and a long-term behavior consistent with citation statistics.

Xia, J. (2011)
Positioning Open Access Journals in a LIS Journal Ranking
College & Research Libraries, 16 May 2011
From the Introduction: some OA journals have successfully built reputations, attracting high-quality articles and sizable numbers of citations. This research is an attempt to add selected OA journals to the journal quality rankings, using library and information science (LIS) as an example.

Sandra Miguel, Zaida Chinchilla-Rodriguez, Félix de Moya-Anegón (2011)
Open access and Scopus: A new approach to scientific visibility from the standpoint of access
Journal of the American Society for Information Science and Technology, published online: 11 April 2011
From the Abstract: This study shows a new approach to scientific visibility from a systematic combination of four databases: Scopus, the Directory of Open Access Journals, Rights Metadata for Open Archiving (RoMEO)/Securing a Hybrid Environment for Research Preservation and Access (SHERPA), and SciMago Journal Rank, and provides an overall, global view of journals according to their formal OA status. The results primarily relate to the number of journals, not to the number of documents published in these journals, and show that in all the disciplinary groups, the presence of green road journals widely surpasses the percentage of gold road publications. The benefits of OA on visibility of the journals are to be found on the green route, but paradoxically, this advantage is not lent by the OA, per se, but rather by the quality of the articles/journals themselves, regardless of their mode of access.

Liu Xue-li, Fang Hong-ling and Wang Mei-ying (2011)
Correlation between Download and Citation and Download-citation Deviation Phenomenon for Some Papers in Chinese Medical Journals
Serials Review, Vol. 37, No. 3, September 2011, 157-161, available online 7 April 2011
From the Abstract: The authors collected the numbers of citations and downloads from 2005 to 2009 of papers in five Chinese general ophthalmological journals published in 2005 from the Chinese Academic Journals Full-text Database and the Chinese Citation Database in Chinese National Knowledge Infrastructure (CNKI) to determine the correlation between download and citation and the peak time of download frequency (DF). The citations from 2000 to 2009 of papers published in 2000 were collected to determine the peak time of citation frequency (CF) of medical papers. There is a highly positive correlation between DF and CF (r = 4.91, P = 0.000).
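The DF-CF relationship above is a Pearson correlation; a minimal self-contained version of the calculation (the download/citation pairs below are invented, not data from the study):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

# Hypothetical (downloads, citations) pairs for six papers:
downloads = [120, 340, 80, 560, 210, 430]
citations = [3, 9, 2, 15, 6, 11]
print(round(pearson_r(downloads, citations), 3))
```

Note that r is bounded by ±1 by construction, which is worth bearing in mind when reading reported coefficients.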

Davis, P. (2011)
Open access, readership, citations: a randomized controlled trial of scientific journal publishing
The FASEB Journal (Journal of the Federation of American Societies for Experimental Biology), 30 Mar 2011
info:doi/10.1096/fj.11-183988
Abstract: Does free access to journal articles result in greater diffusion of scientific knowledge? Using a randomized controlled trial of open access publishing, involving 36 participating journals in the sciences, social sciences, and humanities, we report on the effects of free access on article downloads and citations. Articles placed in the open access condition (n=712) received significantly more downloads and reached a broader audience within the first year, yet were cited no more frequently, nor earlier, than subscription-access control articles (n=2533) within 3 yr. These results may be explained by social stratification, a process that concentrates scientific authors at a small number of elite research universities with excellent access to the scientific literature. The real beneficiaries of open access publishing may not be the research community but communities of practice that consume, but rarely contribute to, the corpus of literature.
See also this author's earlier dissertation.
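Random assignment is the design feature that separates this trial from the observational studies listed elsewhere on this page; a toy sketch of the assign-then-compare step (the article IDs, treatment probability, and download counts are all invented):

```python
import random

def assign_and_compare(articles, outcome, p_treat=0.25, seed=42):
    """Randomly assign each article to the OA arm with probability p_treat,
    then compare mean outcomes between treatment and control arms."""
    rng = random.Random(seed)
    treated, control = [], []
    for a in articles:
        (treated if rng.random() < p_treat else control).append(a)
    mean = lambda arm: sum(outcome[a] for a in arm) / len(arm)
    return mean(treated), mean(control)

# Invented download counts for 100 articles:
articles = list(range(100))
downloads = {a: 50 + (a % 7) * 10 for a in articles}
oa_mean, ctrl_mean = assign_and_compare(articles, downloads)
print(oa_mean, ctrl_mean)
```

Because assignment is random rather than author-chosen, any systematic gap between the two means can be read causally, which is exactly the self-selection problem the trial was built to avoid.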
News articles on this paper:
Corbyn, Z., Open access articles not cited more, finds study, The Great Beyond: Nature news blog, 01 Apr 2011: "We do leave open the possibility that there is a real citation effect as a result of self-archiving but that we simply do not have the statistical power to detect it," says Davis. (In comments appended to this article Davis says: "the piece is not balanced. Zoe focuses on the weaknesses of the study and not its strengths")
Wieder, B., Open Access Does Not Equal More Citations, Study Finds, Wired Campus, Chronicle of Higher Education, April 1, 2011: Mr. Davis says he doesn't see his study as a blow to open access - if anything, he thinks it calls into question the wisdom of looking only at citation counts to measure the impact of a journal article, particularly given the ease of tracking article downloads online. "Twenty years ago, there was no way of measuring readership," he says.

David Crotty, Gaming the System: Do Promises of Citation Advantage Go Too Far? Scholarly Kitchen, Apr 5, 2011: "The Scholarly Kitchen's own Phil Davis (consider this your conflict-of-interest statement) attempts to do away with selection bias by looking at a randomized set of papers. The access status of the articles in the study (OA or under subscription-access control) was determined at random, not by author or editorial choice. The randomization is important here, as it allows Davis to compare equal groups of articles and control for other sources of bias. The study showed that OA articles were cited no more frequently, nor earlier, than subscription-access control articles within 3 years. Davis' study shows that OA articles are more widely read than non-OA articles, and this is a significant benefit of OA publishing. If the goal of the OA movement is to create a scientific literature with a broader reach, then it is succeeding admirably."

Donovan, J. M. and Watson, C. A. (2011)
Citation Advantage of Open Access Legal Scholarship
Social Science Research Network (SSRN), 5 March 2011. In Law Library Journal, 103 (4), Fall 2011, 553-573, http://www.aallnet.org/main-menu/Publications/llj/Vol-103/Fall-2011/2011-35.pdf. Also in UKnowledge, University of Kentucky, 2011, http://works.bepress.com/james_donovan/64/
Abstract: To date, there have been no studies focusing exclusively on the impact of open access on legal scholarship. We examine open access articles from three journals at the University of Georgia School of Law and confirm that legal scholarship freely available via open access improves an article's research impact. Open access legal scholarship, which today appears to account for almost half of the output of law faculties, can expect to receive 50% more citations than non-open access writings of similar age from the same venue.

David Crotty, Gaming the System: Do Promises of Citation Advantage Go Too Far? Scholarly Kitchen, Apr 5, 2011: The authors' conclusion that open access legal scholarship "can expect to receive 58% more citations than non-open access writings of similar age from the same venue" must be questioned. Did they really compare OA articles with non-OA articles under similar conditions? Or did they in fact find a citation advantage for articles that are available in any fashion online versus those that are not? In the law journal study, unequal comparison groups make it unclear whether the authors are measuring access or something else entirely. The study also falls prey to potential issues of selection bias.

Xu, L., Liu, J. and Fang, Q. (2011)
Analysis on open access citation advantage: an empirical study based on Oxford Open journals
iConference '11, Proceedings of the 2011 iConference, Seattle, 11 February 2011
Abstract: This study takes 12,354 original research articles which were published in 93 Oxford Open journals in 2009 as a sample, and carries out statistical analyses on the citation frequency that these articles have received by July 2010 to validate 3 hypotheses: (1) there is a citation advantage for open access articles (OACA) published in Oxford Open journals over the non-OA ones; (2) OACA varies with disciplines; (3) there is some correlation between the impact factors (IFs) of Oxford Open journals and the OACA of their open access articles. This study discovers that there exists an OACA for open access articles, in this case 138.87% higher over non-OA ones; different subjects have different OACAs, and Humanities journals in Oxford Open even have a negative OACA; Oxford Open journals with lower IFs have stronger OACAs than those with higher IFs.

McCabe, M. J. and Snyder, C. M. (2011)
Did Online Access to Journals Change the Economics Literature?
Social Science Research Network, SSRN, January 23, 2011
Abstract: Does online access boost citations? The answer has implications for issues ranging from the value of a citation to the sustainability of open-access journals. Using panel data on citations to economics and business journals, we show that the enormous effects found in previous studies were an artifact of their failure to control for article quality, disappearing once we add fixed effects as controls. The absence of an aggregate effect masks heterogeneity across platforms: JSTOR boosts citations around 10%; ScienceDirect has no effect. We examine other sources of heterogeneity, including whether JSTOR benefits "long-tail" or "superstar" articles more.
See also Kolowich, S., Questioning the 'Citation Advantage', Inside Higher Education, February 10, 2011: investigates reaction to this paper. "While McCabe and Snyder actually studied the effect of economics and business articles being available online as opposed to just in print -- not the effect of articles being online and free -- they nevertheless believe that their findings cast doubt on the supposed citation advantage of open-access articles, because, they contend, their study measures whether ease of access affects citation volume. ... contrary to its provocative assertion about the lack of evidence that free online access performs better, their paper does not address the citation advantage of free versus not free, Harnad says, and therefore cannot convincingly refute studies that do."

Saadat, R. and Shabani, A. (2011)
Investigating the Citations Received by Journals of Directory of Open Access Journals from ISI Web of Science's Articles
International Journal of Information Science and Management, 9 (1), 57-74, January 2011
From the Abstract: In this research, the citations received by DOAJ's journals from the ISI Web of Science's articles in 2003 to 2008 were studied and compared. The citations received by the journals in five fields (Arts & Humanities, Social Sciences, Pure Sciences, Technology & Engineering, and Health & Medical Sciences) as well as the difference among the citations received by DOAJ's journals in the above-mentioned five fields were examined. The research method is citation analysis and the research data have been collected by means of Cited Reference Search in the ISI Web of Science. The English-language journals in DOAJ were chosen, and no sampling was used. Findings showed that out of 2,953 journals, 321 journals (10.87%) received citations, and the total citations received by these journals were 19,050, with a mean of 6.45 per journal; the journals in Pure Sciences received the most citations (10,116 citations, equal to 53.1%), and the ones in Arts & Humanities received the least citations (701 citations, equal to 3.68%).

Lee, K., Brownstein, J. S., Mills, R. G., Kohane, I. S. (2010)
Does Collocation Inform the Impact of Collaboration?
PLoS ONE, 5(12), e14279, 15 Dec 2010
From the Abstract: Despite the positive impact of emerging communication technologies on scientific research, our results provide striking evidence for the role of physical proximity as a predictor of the impact of collaborations. From the Discussion: There have been numerous articles reporting that Open Access publications have a higher chance of being cited. It may be that publications in Open Access journals have higher citation counts, which may not necessarily be related to collaboration and collocation. However, its impact on our results is uncertain, as there is also a growing number of articles reporting no evidence of an Open Access advantage in different disciplines.

Xia, J., Myers, R. L. and Wilhoite, S. K. (2010)
Multiple open access availability and citation impact
Journal of Information Science, 37 (1): 19-28, published online 10 Dec 2010
Abstract: This research examines the relationship between multiple open access (OA) availability of journal articles and the citation advantage by collecting data on OA copies and citation numbers in 20 top library and information science journals. We discover a correlation between the two variables; namely, multiple OA availability of an article has a positive impact on its citation count. The statistical analysis reveals that for every increase in the availability of OA articles, citation numbers increase by 2.348.
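The 2.348 figure reported above is a regression coefficient; an ordinary least-squares slope over (OA copies, citations) pairs sketches the idea. The pairs below are invented purely for illustration:

```python
def ols_slope(xs, ys):
    """Least-squares slope of y on x: cov(x, y) / var(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

# Hypothetical data: number of OA copies of each article, and its citations.
oa_copies = [0, 1, 1, 2, 3, 4]
cites = [2, 4, 5, 7, 9, 12]
print(round(ols_slope(oa_copies, cites), 3))
```

The slope is read as "additional citations per additional OA copy", which is the form of the study's 2.348 result.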

Davis. P. (2010)
Does Open Access Lead to Increased Readership and Citations? A Randomized Controlled Trial of Articles Published in APS Journals
The Physiologist, 53 (6), December 2010
Extracts. Introduction: In order to isolate the effect of access on readership and citations, we conducted a randomized controlled trial of open access publishing on articles published electronically in 11 APS journals. This report details the findings three years after the commencement of the experiment. Discussion: The results of this experiment suggest that providing free access to the scientific literature may increase readership (as measured by article downloads) and reach a larger potential audience (as measured by unique visitors), but have no effect on article citations. These results are consistent with an earlier report of the APS study after one year and the results of other scientific journals after two years. The fact that we observe an increase in readership and visitors for Open Access articles but no citation advantage suggests, first, that scientific authors are adequately served by the current APS model of information dissemination and, second, that the additional readership is taking place outside this core research community.

Düzyol, G., Taskin, Z. and Tonta, Y. (2010)
Mapping the Intellectual Structure of Open Access Field Through Co-citations
E-LIS, 29 November 2010
In IFLA Satellite Pre-conference: Open Access to Science Information: Trends, Models and Strategies for Libraries, 6-8 August 2010, Crete (Greece)
also at http://yunus.hacettepe.edu.tr/~tonta/yayinlar/tonta-duzyol-taskin-oa.pdf
From the Abstract: This paper maps the intellectual structure of open access based on 281 articles that appeared in the professional literature on the topic between 2000 and 2010. Using bibliometric and co-citation analyses, co-citation patterns of papers are visualized through a number of co-citation maps. CiteSpace was used to analyze and visualize co-citation maps. Maps show major areas of research, prominent articles, major knowledge producers and journals in the field of open access. The letter written by Steven Lawrence (Free online availability substantially increases a paper's impact, 2001) appears to be the most prominent source, as it was cited the most. The preliminary findings show that open access is an emerging research field. Findings of this study can be used to identify landmark papers along with their impact in terms of providing different perspectives and engendering new research areas.

Davis. P. (2010)
Access, Readership, Citations: A Randomized Controlled Trial of Scientific Journal Publishing
eCommons@Cornell, 20 October 2010
From the abstract: This dissertation explores the relationship of Open Access publishing with subsequent readership and citations. It reports the findings of a randomized controlled trial involving 36 academic journals produced by seven publishers in the sciences, social sciences and humanities. At the time of this writing, all articles have aged at least two years. Articles receiving the Open Access treatment received significantly more readership (as measured by article downloads) and reached a broader audience (as measured by unique visitors), yet were cited no more frequently, nor earlier, than subscription-access control articles. A pronounced increase in article downloads with no commensurate increase in citations to Open Access treatment articles may be explained through social stratification, a process which concentrates scientific authors at elite, resource-rich institutions with excellent access to the scientific literature. For this community, access is essentially a non-issue.

Gargouri, Y., Hajjem, C., Lariviere, V., Gingras, Y., Brody, T., Carr, L. and Harnad, S. (2010)
Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research
PLoS ONE 5(10): e13636, October 18, 2010, doi:10.1371/journal.pone.0013636
Also in ECS EPrints, 10 Feb 2010, http://eprints.ecs.soton.ac.uk/18493/ (this version includes the paper and full supplemental materials, including further analyses and responses to comments and feedback), and in arXiv, arXiv:1001.0361v2 [cs.CY], 3 Jan 2010
Abstract
Background. Articles whose authors have supplemented subscription-based access to the publisher's version by self-archiving their own final draft to make it accessible free for all on the web (Open Access, OA) are cited significantly more than articles in the same journal and year that have not been made OA. Some have suggested that this OA Advantage may not be causal but just a self-selection bias, because authors preferentially make higher-quality articles OA. To test this we compared self-selective self-archiving with mandatory self-archiving for a sample of 27,197 articles published 2002-2006 in 1,984 journals.
Methodology/Principal Findings. The OA Advantage proved just as high for both. Logistic regression analysis showed that the advantage is independent of other correlates of citations (article age; journal impact factor; number of co-authors, references or pages; field; article type; or country) and highest for the most highly cited articles. The OA Advantage is real, independent and causal, but skewed. Its size is indeed correlated with quality, just as citations themselves are (the top 20% of articles receive about 80% of all citations).
Conclusions/Significance. The OA advantage is greater for the more citable articles, not because of a quality bias from authors self-selecting what to make OA, but because of a quality advantage, from users self-selecting what to use and cite, freed by OA from the constraints of selective accessibility to subscribers only. It is hoped that these findings will help motivate the adoption of OA self-archiving mandates by universities, research institutions and research funders.
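The logistic-regression step reported in the findings can be sketched with plain gradient descent; in this toy version (all data invented) a highly-cited flag is regressed on an OA indicator plus one covariate, so the OA coefficient is read net of that covariate:

```python
from math import exp

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain stochastic-gradient-descent logistic regression;
    returns weights [bias, w1, ..., wk] for rows X and 0/1 labels y."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(steps):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1 / (1 + exp(-z))        # predicted probability of "highly cited"
            err = yi - p
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

# Invented rows: [is_OA, journal_impact_factor]; label: highly cited (1) or not (0).
X = [[1, 2.0], [1, 1.0], [0, 2.0], [0, 1.0], [1, 3.0], [0, 3.0]]
y = [1, 1, 0, 0, 1, 0]
bias, w_oa, w_if = fit_logistic(X, y)
print(w_oa > 0)  # → True: a positive OA coefficient, net of impact factor
```

The paper's actual model includes many more covariates (age, co-authors, field, country, and so on); the point here is only that the OA term is estimated jointly with them rather than in isolation.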

Comments below refer to the PLoS ONE publication
Howard, J., Is There an Open-Access Citation Advantage? The Chronicle of Higher Education, 19 Oct 2010:
Seeks to reignite the earlier debate around the preprint of the new PLoS ONE paper. And succeeds - see reader responses attached to the article, some extracts below.
Harnad, S., Correlation, Causation, and the Weight of Evidence, Open Access Archivangelism, 20 Oct 2010: One can only speculate on the reasons why some might still wish to cling to the self-selection bias hypothesis in the face of all the evidence to date. The straightforward causal relationship is the default hypothesis, based on both plausibility and the cumulative weight of the evidence. Hence the burden of providing counter-evidence to refute it is now on the advocates of the alternative.
sk_griffhoven, October 20, 2010: The authors were unable to control for institutional effects in their model. While deposit mandates might be responsible for the results they report, they might not, and I don't see how mandates would outperform self-selection. Most importantly, there is no basis for making a causal claim. I agree with Philip Davis: the authors greatly overstate their results.
stevanharnad, October 21, 2010: The causal claim is not that mandated OA out-performs self-selected OA, but that self-selected OA does *not* out-perform mandated OA, hence OA is causal.
signofthefourwinds, October 21, 2010: It doesn't make sense that researchers suddenly give up their habit of consulting certain databases to go search in Google Scholar. What user behavior change accounts for the OA advantage?
stevanharnad, October 22, 2010: The OA Advantage is not just, or primarily, a convenience or laziness effect (though some of that no doubt contributes to it too): It is not that scholars have become sloppy, relying on Google Scholar instead of consulting more established databases. It is that when their institution cannot afford access to articles they need, they must make do with only those of them that they can access for free online.
Patrick Chardenet, October 22, 2010: I don't think that the problem is to know whether or not Open Access reinforces the number of citations. It is rather a question of knowing whether the measurement of science by the number of citations has any interest for scientific development.
stevanharnad, October 22, 2010: In a nutshell, citations are not the goal of research; the goal is that the research should be read, used and built upon, in further research and applications. And citations are a measure of that. But for research to be read, used and built upon, it has to be accessible. That is why and how OA increases citations.
Fenner, M., New in PLoS ONE: Citation rates of self-selected vs. mandated Open Access, PLoS Blogs, Gobbledygook, 19 Oct 2010: "I feel that the paper comes a little short. Yes, they did a very detailed analysis of the citation behavior, and take into account important cofactors. But the reader is left with the impression that mandatory self-archiving of post-prints in institutional repositories is the only reasonable Open Access strategy, and the introduction and discussion accordingly leave out some important arguments." Notable for the substantive discussion that follows between the blogger (Fenner) and one of the principal authors of the paper (Harnad).
Harnad, S., Comparing OA and Non-OA: Some Methodological Supplements, Open Access Archivangelism, 19 Oct 2010. Responds to Fenner: "If we have given 'the impression that mandatory self-archiving of post-prints in institutional repositories is the only reasonable Open Access strategy,' then we have succeeded in conveying the implication of our findings."

Comments below refer to the preprint
Davis, P., Does a Citation Advantage Exist for Mandated Open Access Articles? the scholarly kitchen, Jan 7, 2010: "Gargouri reports that institutionally-mandated OA papers received about a 15% citation advantage over self-selected OA papers, which seems somewhat counter-intuitive. If better articles tend to be self-archived, their reasoning goes, we should expect that papers deposited under institution-wide mandates would under-perform those where the authors select which articles to archive. The authors of this paper deal, rather unscientifically, with this inconvenient truth with a quick statistical dismissal: that their finding 'might be due to chance or sampling error.' In sum, this paper tests an interesting testable hypothesis on whether mandatory self-archiving policies are beneficial to their authors in terms of citations. Their unorthodox methodology, however, results in some inconsistent and counter-intuitive results that are not properly addressed in their narrative."
The following were among thecommentsadded to the above blog by Davis.
Harnad, S., Jan 7, 2010: "Mandated OA Advantage? Yes, the fact that thecitation advantage of mandated OA was slightly greater than that ofself-selected OA is surprising, and if it proves reliable, it isinteresting and worthy of interpretation. We did not interpret it inour paper, because it was the smallest effect, and our focus was ontesting the Self-Selection/Quality-Bias hypothesis, according to whichmandated OA should have little or no citation advantage at all, ifself-selection is a major contributor to the OA citation advantage."
Gaule,P., Jan 7, 2010: "the paper does not appear to include controls forinstitutions of the authors of the control sample. This is particularlyworrisome when comparing papers originating from CERN which arguablydoes cutting edge physics to the control papers. The key issue in thispaper seems to be interpreting the mandated open access versusself-selected open access. The authors find and point out that themandates actually result in compliance of around 60%. However, theyhave little to say on what is going on here and why papers end up inthe compliant group or not. I am not sure what conclusions can beinferred from this comparison of two types of self-selection, at leastone of which is not well understood."
Harnad, S., Jan 8, 2010: "(2)THE SPECIAL CASE OF CERN: With so few institutional mandates, its notyet possible to control for institutional quality. But CERN is indeed aspecial case; when it is removed, however, it does not alter thepattern of our results. (3) SELECTIVE COMPLIANCE? Mandate compliance isnot yet 100%, so some form of self-selection still remains a logicalpossibility, but we think this is made extremely improbable when thereis no decline in the OA Advantage even when mandates quadruple the OArate from the spontaneous self-selective baseline of 15% to the currentmandated average of 60%."
Schneider, J. W., Jan 8, 2010: "the claimof causality seems well beyond the mark. Neither former research northe current regression design permits any casual claims."
Harnad,S., Jan 9, 2010: "CAUSALITY: We agree that causality is difficult todemonstrate with correlational statistics. However, we note that thehypothesis that (2a) making articles open access causes them to be moreciteable and the hypothesis that (2b) being more citeable causesarticles to be made open access are both causal hypotheses."
Harnad, S.,OpenAccess: Self-Selected, Mandated & Random; Answers &Questions,Open Access Archivangelism, February 8, 2010: "What follows is what wehope will be found to be a conscientious and attentive series ofresponses to questions raised by Phil Davis about our paper (currentlyunder refereeing) -- responses for which we did further analyses of ourdata (not included in the draft under refereeing)."

Zawacki-Richter, O., Anderson, T. and Tuncay, N. (2010)
The Growing Impact of Open Access Distance Education Journals: A Bibliometric Analysis
The Journal of Distance Education / Revue de l'Éducation à Distance, 24(3), 2010
From the Abstract: we examine 12 distance education journals (6 open and 6 published in closed format by commercial publishers). Using an online survey completed by members of the editorial boards of these 12 journals and a systematic review of the number of citations per article (N = 1,123) and per journal issue between 2003 and 2008, we examine the impact and perceived value of the 12 journals. We then compute differences between open and closed journals. The results reveal that the open access journals are not perceived by distance education editors as significantly more or less prestigious than their closed counterparts. The number of citations per journal and per article also indicates little difference. However, we note a trend towards more citations per article in open access journals. Articles in open access journals are cited earlier than in non-open access journals.

Kim, J. (2010)
Faculty self-archiving: Motivations and barriers
Journal of the American Society for Information Science and Technology, 16 Jul 2010
info:doi/10.1002/asi.21336
This paper is broader than open access impact, but one part of the investigation looked at it.
From the Discussion: A few interviewees did believe that self-archiving resulted in their research work being cited more frequently, although 13 interviewees were unsure about the positive relationship between self-archiving and the citation rate. Professors even considered self-archiving to serve other purposes, for example, to recruit graduate students, or to find collaborators, instead of increasing the impact of research. In fact, five interviewees expressed uncertainty regarding whether self-archiving would improve professional recognition. Four other interviewees did not expect self-archiving to increase academic recognition, as they believed this related more to the quality of the research itself, rather than merely making it publicly accessible. These findings suggested that the majority of faculty participants in this study were unaware of the evidence of a citation advantage from OA previously identified by several studies. Without noticing the evidence, professors tend not to expect a citation advantage from self-archiving; however, they see benefits from the user side through self-archiving. This study shows that faculty have diverse opinions about citation rates and academic recognition related to self-archiving.

Strotmann, A. and Zhao, D. (2010)
Impact of Open Access on stem cell research: An author co-citation analysis
76th IFLA General Conference and Assembly, Gothenburg, Sweden, 22 Jun 2010
Abstract: We explore the impact of Open Access (OA) on stem cell research through a comparison of research reported in OA and in non-OA publications. Using an author co-citation analysis method, we find that (a) OA and non-OA publications cover similar major research areas in the stem cell field, but (b) a more diverse range of basic and medical research is reported in OA publications, while (c) biomedical technology areas appear biased towards non-OA publications. From the Introduction: many studies have investigated whether OA publication of research results has a positive effect on the citation ranking of those publications ... we approach the comparison between OA and non-OA publishing of research results from a somewhat different perspective. We explore whether there are substantial differences between the intellectual structure of a research field when viewed from either the point of view of the OA publications in that field or from that of its non-OA publications.

Jacques, T. S. and Sebire, N. J. (2010)
The impact of article titles on citation hits: an analysis of general and specialist medical journals
JRSM Short Reports, 1 (1), 2, 01 Jun 2010
info:doi/10.1258/shorts.2009.100020
More factors to consider in citation impact assessment.
From the Abstract: We hypothesized that specific features of journal titles may be related to citation rates. We reviewed the title characteristics of the 25 most cited articles and the 25 least cited articles published in 2005 in general and specialist medical journals including the Lancet, BMJ and Journal of Clinical Pathology. The title length and construction were correlated to the number of times the papers have been cited to May 2009. Results: The number of citations was positively correlated with the length of the title, the presence of a colon in the title and the presence of an acronym. Factors that predicted poor citation included reference to a specific country in the title. Conclusions: These data suggest that the construction of an article title has a significant impact on how frequently the paper is cited. We hypothesize that this may be related to the way electronic searches of the literature are undertaken.

Herb, U. (2010)
Alternative Impact Measures for Open Access Documents? An examination how to generate interoperable usage information from distributed open access services
76th IFLA General Conference and Assembly, Gothenburg, Sweden, August 2010, paper available online 29 May 2010
From the Abstract: This contribution shows that most common methods to assess the impact of scientific publications often discriminate open access publications and by that reduce the attractiveness of Open Access for scientists. Assuming that the motivation to use open access publishing services (e.g. a journal or a repository) would increase if these services would convey some sort of reputation or impact to the scientists, alternative models of impact are discussed. Prevailing research results indicate that alternative metrics based on usage information of electronic documents are suitable to complement or to relativize citation based indicators.

Giglia, E. (2010)
The Impact Factor of Open Access journals: data and trends
DHANKEN, digital repository of HANKEN research, 27 May 2010. In 14th International Conference on Electronic Publishing, Helsinki, 16-18 June 2010. Slides in E-LIS, 21 June 2010 http://eprints.rclis.org/18669/
From the Abstract: The aim of this preliminary work, focused on Gold Open Access, is to test the performance of Open Access journals with the most traditional bibliometric indicator, the Impact Factor, to verify the hypothesis that unrestricted access might turn into more citations and therefore also good Impact Factor indices. Open Access journals are relatively new actors in the publishing market, and gaining reputation and visibility is a complex challenge. Some of them show impressive Impact Factor trends since their first year of tracking.

Calver, M. C. and Bradley, J. S. (2010)
Patterns of Citations of Open Access and Non-Open Access Conservation Biology Journal Papers and Book Chapters
Conservation Biology, published online: 23 Apr 2010
From the abstract: We compared the number of citations of OA and non-OA papers in six journals and four books published since 2000 to test whether OA increases number of citations overall and increases citations made by authors in developing countries. After controlling for type of paper (e.g., review or research paper), length of paper, authors' citation profiles, number of authors per paper, and whether the author or the publisher released the paper in OA, OA had no statistically significant influence on the overall number of citations per journal paper. Journal papers were cited more frequently if the authors had published highly cited papers previously, were members of large teams of authors, or published relatively long papers, but papers were not cited more frequently if they were published in an OA source. Nevertheless, author-archived OA book chapters accrued up to eight times more citations than chapters in the same book that were not available through OA, perhaps because there is no online abstracting service for book chapters. There was also little evidence that journal papers or book chapters published in OA received more citations from authors in developing countries relative to those journal papers or book chapters not published in OA. For scholarly publications in conservation biology, only book chapters had an OA citation advantage, and OA did not increase the number of citations papers or chapters received from authors in developing countries.

Pienta, Amy M., Alter, George C., Lyle, Jared A. (2010)
The Enduring Value of Social Science Research: The Use and Reuse of Primary Research Data
Deep Blue, University of Michigan, 22 Nov 2010. In The Organisation, Economics and Policy of Scientific Research workshop, Torino, April 2010 http://www.carloalberto.org/files/brick_dime_strike_workshopagenda_april2010.pdf
Abstract: The goal of this paper is to examine the extent to which social science research data are shared and assess whether data sharing affects research productivity tied to the research data themselves. We construct a database from administrative records containing information about thousands of social science studies that have been conducted over the last 40 years. Included in the database are descriptions of social science data collections funded by the National Science Foundation and the National Institutes of Health. A survey of the principal investigators of a subset of these social science awards was also conducted. We report that very few social science data collections are preserved and disseminated by an archive or institutional repository. Informal sharing of data in the social sciences is much more common. The main analysis examines publication metrics that can be tied to the research data collected with NSF and NIH funding - total publications, primary publications (including PI), and secondary publications (non-research team). Multivariate models of count of publications suggest that data sharing, especially sharing data through an archive, leads to many more times the publications than not sharing data. This finding is robust even when the models are adjusted for PI characteristics, grant award features, and institutional characteristics.

Habibzadeh, F. and Yadollahie, M. (2010)
Are Shorter Article Titles More Attractive for Citations? Cross-sectional Study of 22 Scientific Journals
Croatian Medical Journal, 51 (2), April 2010
Open access is not the only factor affecting citation impact. Here is another factor that has received rather less attention. From the Abstract: Longer titles seem to be associated with higher citation rates. This association is more pronounced for journals with high impact factors. Editors who insist on brief and concise titles should perhaps update the guidelines for authors of their journals and have more flexibility regarding the length of the title.

Agerbæk, A. and Nielsen, K. (2010)
Factors in Open Access which Influence the Impact Cycle
ScieCom info, Vol 6, No 1, 2010 (issue notice posted 22 March 2010)
Short paper illustrating journal publishing flowcharts for non-open access (OA), gold OA and green OA, showing why, in principle, open access might lead to higher citations due to wider and earlier dissemination.

Wagner, A. B. (2010)
Open Access Citation Advantage: An Annotated Bibliography
Issues in Science and Technology Librarianship, No. 60, Winter 2010(issue notice posted 16 March 2010)
The bibliography is divided into three sections:
- Review articles [5 reviews]
- Studies showing an open access citation advantage (OACA) [39 articles]
- Studies showing either no OACA effect or ascribing OACA to factors unrelated to OA publication [7 articles]
The following databases were searched ... results were cross-checked against an extensive, more general bibliography (this bibliography). It is interesting to note that no study has ever claimed that OA articles were cited less than TA articles. The research question still being debated is whether other factors explain the widely observed OACA (Open Access Citation Advantage) rather than the mere fact an article is open access.

Snijder, R. (2010)
The profits of free books - an experiment to measure the impact of Open Access publishing
Google sites, undated, but first spotted in the wild 23 February 2010. In Learned Publishing, Vol. 23, No. 4, October 2010, 293-301
Abstract: to measure the impact of Open Access (OA) publishing of academic books, an experiment was set up. During a period of nine months three sets of books were disseminated through an institutional repository, the Google Book Search program or both channels. A fourth set was used as a control group. Open Access publishing enhances discovery and online consultation. No relation could be found between OA publishing and citation rates. Contrary to expectations, OA publishing does not stimulate or diminish sales figures. The Google Book Search program is superior compared to the repository.

Swan, A. (2010)
The Open Access citation advantage: Studies and results to date
ECS EPrints, 17 Feb 2010
Abstract: presents a summary of reported studies on the Open Access citation advantage. There is a brief introduction to the main issues involved in carrying out such studies, both methodological and interpretive. The study listing provides some details of the coverage, methodological approach and main conclusions of each study.

Giglia, E. (2010)
Più citazioni in Open Access? Panorama della letteratura con uno studio sull'Impact Factor delle riviste Open Access [More citations in Open Access? A survey of the literature with a study on the Impact Factor of Open Access journals]
E-LIS, 21 Jan 2010, also in CIBER 1999-2009, 2009 (Ledizioni), pp. 125-145
From the English abstract: This work aims to frame the international debate on the citation advantage of articles published in Open Access, then presents and discusses overall data on the Impact Factor of open access journals, the result of an original study conducted in the Journal Citation Reports (Thomson Reuters). The basic idea is to test the performance of OA journals according to the traditional bibliometric indicator, the Impact Factor, in order to test the hypothesis that unrestricted access may involve a greater number of citations and, therefore, also a good Impact Factor. The results seem to confirm this: 38.62% of the Open Access journals included in the Journal Citation Reports are positioned in the first five percentiles when ranked by Impact Factor. Using the Immediacy Index the percentage is 37.16%, while with the new 5-year Impact Factor indicator - which, however, applies to only 356 of the 479 titles - the percentage rises to 40.05%.
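The percentile positioning reported here can be reproduced mechanically: rank every journal in a category by Impact Factor and compute the percentile each journal occupies. A minimal sketch in Python; the `percentile_rank` helper and the IF values are invented for illustration, not taken from the Journal Citation Reports:

```python
def percentile_rank(value, population):
    """Percentile of `value` within `population`: share of values it beats or ties."""
    below = sum(1 for v in population if v <= value)
    return 100.0 * below / len(population)

# Hypothetical Impact Factors for one subject category (invented values).
category_ifs = [0.4, 0.9, 1.2, 1.8, 2.5, 3.1, 4.0, 5.6, 7.2, 9.8]
oa_journal_if = 5.6

pct = percentile_rank(oa_journal_if, category_ifs)
print(f"Journal sits at the {pct:.0f}th percentile of its category")
# A journal "in the first five percentiles" in the study's sense would rank near the top.
```

With real JCR data the same computation would be repeated per subject category before aggregating the share of OA journals above a percentile threshold.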

Davis, P. (2009)
Studies on access: a review
arXiv:0912.3953v1 [cs.DL], 20 Dec 2009
Brief abstract: A review of the empirical literature on access to scholarly information. This review focuses on surveys of authors, article download and citation analysis.

Ibanez, A., Larranaga, P. and Bielza, C. (2009)
Predicting citation count of Bioinformatics papers within four years of publication
Bioinformatics, 25 (24), 3303-3309, 15 December 2009
info:pmid/19819886 | info:doi/10.1093/bioinformatics/btp585
From the abstract: "The possibility of a journal having a tool capable of predicting the citation count of an article within the first few years after publication would pave the way for new assessment systems. Results: This article presents a new approach based on building several prediction models for the Bioinformatics journal. These models predict the citation count of an article within 4 years after publication (global models). To build these models, tokens found in the abstracts of Bioinformatics papers have been used as predictive features, along with other features like the journal sections and 2-week post-publication periods." Comment: Bioinformatics is not an open access journal, so these results are not based on data for open access papers, but they may have parallels with methods for predicting citations and impact based on usage of OA papers (e.g. Brody et al., 2005). Data on which the results are based can be found at the authors' site. Without access to the full paper it is not clear what predictive features are being applied to achieve the claimed successful results: "In these new models, the average success rate for predictions using the naive Bayes and logistic regression supervised classification methods was 89.4% and 91.5%, respectively, within the nine sections and for 4-year time horizon."
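The naive Bayes classification mentioned in the quoted results can be illustrated with a toy classifier that predicts a high or low citation class from abstract tokens. This is only a sketch of the general technique, not the authors' model; the training abstracts, labels and function names are invented:

```python
import math
from collections import Counter

def train(docs):
    """docs: list of (tokens, label). Returns per-label token counts, doc totals, vocab."""
    counts = {}          # label -> Counter of token occurrences
    totals = Counter()   # label -> number of training documents
    for tokens, label in docs:
        counts.setdefault(label, Counter()).update(tokens)
        totals[label] += 1
    vocab = {t for c in counts.values() for t in c}
    return counts, totals, vocab

def predict(tokens, counts, totals, vocab):
    """Return the label maximising log P(label) + sum log P(token|label), Laplace-smoothed."""
    n_docs = sum(totals.values())
    best, best_lp = None, -math.inf
    for label, c in counts.items():
        lp = math.log(totals[label] / n_docs)
        denom = sum(c.values()) + len(vocab)
        for t in tokens:
            lp += math.log((c[t] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Invented training data: abstract tokens labelled by citation class.
train_docs = [
    ("novel genome alignment algorithm".split(), "high"),
    ("genome sequencing pipeline benchmark".split(), "high"),
    ("case report rare variant".split(), "low"),
    ("survey of database usage".split(), "low"),
]
counts, totals, vocab = train(train_docs)
print(predict("genome alignment tool".split(), counts, totals, vocab))
```

A real model would also bin the citation counts into classes, add the non-token features the abstract mentions (journal section, post-publication period), and evaluate on held-out papers.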

Rand, D. G. and Pfeiffer, T. (2009)
Systematic Differences in Impact across Publication Tracks at PNAS
PLoS ONE, 4(12): e8092, December 1, 2009
Investigates citation counts for the three different publication tracks of the Proceedings of the National Academy of Sciences (PNAS). Open access is used as a control factor in the analysis: "To empirically investigate the impact of papers published via each track, we inspect 2695 papers published between June 1, 2004 and April 26, 2005, covering PNAS Volume 101 Issue 22 through Volume 102 Issue 17. For each paper, we examine Thomson Reuters Web of Science citation data as of October 2006 and May 2009, as well as page-view counts as of October 2006. We also note the track through which each paper was published, the topic classification of each paper, the date of publication, and whether each article was published as open access and/or as part of a special feature." In quantifying the size of the OA effect to control for, the paper found that "similar to previous observations, Open Access papers receive approximately 25% more citations than non-Open Access papers (Median 2006 [2009] citations: Open access = 12.5 [38], non-Open access = 10 [30]; 75% percentile 2006 [2009] citations: Open access = 21 [61], non-Open access = 17 [50])."

Soong, S. (2009)
Measuring Citation Advantages of Open Accessibility
D-Lib Magazine, 15 (11/12), December 2009
Short study of a small collection of papers deposited after publication in the institutional repository of the Hong Kong University of Science and Technology (HKUST). "A total of 50 archived journal articles that already have 10 or more citation counts in Scopus were randomly selected for inclusion in this study." Claims to present an "easy-to-follow framework for citation impact analysis of open accessibility. This framework allows for direct measurement and comparison of citation rates before and after journal articles are made openly available." The method compares the citation performance of the same article over time pre- and post-open access rather than, as in other studies of OA impact, comparing open access papers with non-open access papers from the same source.

Poor, N. (2009)
Global Citation Patterns of Open Access Communication Studies Journals: Pushing Beyond the Social Science Citation Index
International Journal of Communication, Vol. 3, 2009
From the abstract: Connectivity and citations, as used by a large number of scholars in different fields, are a common measure of the health of a discipline. This paper shows the citation patterns for a multinational sample of open access journals in Communication Studies. Their citations are similar to those of the main communication journals, but with more international citations. Differences in the citation patterns are attributable to the international nature of the sampled journals, not to their open access status. From the conclusion: The citation pattern of these open access journals is the same as that for non-open access journals, which is how it should be if open access journals are going to be of the same quality as more established, non-open access journals (recall that open access does require peer-review). The journals in the sample are not in a separate citation space, and they take part in the larger conversation of the field. As such, this indicates, to a certain extent, the health of these journals (they are not isolates in the citing direction), which, in turn, is a decent indicator for the health of the field.

Akre, O., Barone-Adesi, F., Pettersson, A., Pearce, N., Merletti, F., and Richiardi, L. (2009)
Differences in citation rates by country of origin for papers published in top-ranked medical journals: do they reflect inequalities in access to publication?
Journal of Epidemiology and Community Health, 24 Nov 2009
info:pmid/19934169 | info:doi/10.1136/jech.2009.088690
Investigates the connection between citations and access, but not open access, by country and income with reference to the country of publication. From the abstract: "Methods: We obtained the number of citations and the corresponding author's country for 4724 papers published between 1998 and 2002 in the British Medical Journal, the Lancet, Journal of the American Medical Association, New England Medical Journal. Countries were grouped according to the World Bank classification and geographic location: low-middle income countries, European high-income countries, non-European high-income countries, UK and USA. Conclusions: Papers from different countries published in the same journal have different citation rates."

Mertens, S. (2009)
Open Access: Unlimited Web Based Literature Searching
Dtsch Arztebl Int., 106(43): October 23, 2009, 710-712
Reviews the findings of most of the principal papers found in this bibliography.

Kousha, K. and Abdoli, M. (2009)
The citation impact of Open Access Agricultural Research: a comparison between OA and Non-OA publications (pdf 12pp)
World Library And Information Congress: 75th IFLA General Conference and Council, 23-27 August 2009, Milan, Italy. Also in Online Information Review, Vol. 34, No. 5, 2010, 772-785 http://dx.doi.org/10.1108/14684521011084618
Blogged summary, Open Access enhances accessibility and citation impact, International Association of Agricultural Information Specialists, 13 July 2009: "The results showed that there is an obvious citation advantage for self-archived agriculture articles as compared to non-OA articles." - "results indicate that self-archived research articles published in the non-OA agriculture journals could attract nearly two times more citations than their non-OA counterparts."

Lariviere, V. and Gingras, Y. (2009)
The impact factor's Matthew effect: a natural experiment in bibliometrics
arXiv.org, arXiv:0908.3177v1 [physics.soc-ph], 21 Aug 2009, also in Journal of the American Society for Information Science and Technology, 61 (2): 424-427, February 2010
Makes no mention of open access impact, but presents some interesting parallel results on journal impact factors, in this case that publication in higher impact journals can result in higher citations for a given paper.

Asif-ul Haque and Ginsparg, P. (2009)
Positional Effects on Citation and Readership in arXiv
arXiv.org, arXiv:0907.4740v1 [cs.DL], 27 Jul 2009
In Journal of the American Society for Information Science and Technology, Vol. 60 No. 11, 2203-2218, published online: 22 Jul 2009

Greyson, D., Morgan, S., Hanley, G. and Wahyuni, D. (2009)
Open access archiving and article citations within health services and policy research
E-LIS, 14 Jul 2009, in Journal of the Canadian Health Libraries Association (JCHLA) / Journal de l'Association des bibliothèques de la santé du Canada (JABSC), 2009, vol. 30, no. 2, 51-58
From the abstract: This paper contributes to a growing body of research exploring the “OA advantage” by employing an article-level analysis comparing citation rates for articles drawn from the same, purposively selected journals. We used a two-stage analytic approach designed to test whether OA is associated with (1) the likelihood that an article is cited at all and (2) the total number of citations that an article receives, conditional on being cited at least once. Adjusting for potential confounders: number of authors, time since publication, journal, and article subject, we found that OA archived articles were 60% more likely to be cited at least once, and, once cited, were cited 29% more than non-OA articles.
See also this poster (1pp) with the same title, E-LIS, 14 Jul 2009, in Canadian Health Libraries Association / Association des bibliothèques de la santé du Canada (CHLA/ABSC) Conference 2009, Winnipeg, Manitoba (Canada), May 30 - June 3, 2009
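The two-stage approach described in the abstract can be sketched on toy data: stage one compares the odds of being cited at all, stage two compares citations among cited articles only. The citation counts and the `two_stage_summary` helper below are invented for illustration and omit the study's confounder adjustments:

```python
def two_stage_summary(oa_counts, non_oa_counts):
    """Stage 1: odds ratio of receiving >= 1 citation (OA vs non-OA).
    Stage 2: ratio of mean citations, conditional on being cited at least once."""
    def odds_cited(counts):
        cited = sum(1 for c in counts if c > 0)
        return cited / (len(counts) - cited)

    def mean_if_cited(counts):
        cited = [c for c in counts if c > 0]
        return sum(cited) / len(cited)

    odds_ratio = odds_cited(oa_counts) / odds_cited(non_oa_counts)
    citation_ratio = mean_if_cited(oa_counts) / mean_if_cited(non_oa_counts)
    return odds_ratio, citation_ratio

# Invented per-article citation counts for illustration.
oa = [0, 3, 5, 8, 2, 0, 6, 4]
non_oa = [0, 1, 4, 0, 2, 0, 5, 3]

odds_ratio, citation_ratio = two_stage_summary(oa, non_oa)
print(f"odds ratio (cited at all): {odds_ratio:.2f}")
print(f"citation ratio (among cited): {citation_ratio:.2f}")
```

The published analysis fits regression models at each stage so that the two effects can be adjusted for authors, journal, subject and time since publication; the sketch only shows the unadjusted two-stage decomposition.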

Joint, N. (2009)
The Antaeus column: does the “open access” advantage exist? A librarian's perspective
Library Review, Vol. 58, No. 7, 2009, 477-481
From the summary: Findings – The paper finds that many of the original arguments for the benefits of open access have fallen by the wayside; but that, in spite of this, there is good evidence that an “open access advantage” does exist. The application of straightforward library statistical counting measures which are traditionally used to evaluate user benefits of mainstream services is just as effective an evaluation tool as more sophisticated citation analysis methods.

Gentil-Beccot, A., Mele, S., Brooks, T. (2009)
Citing and Reading Behaviours in High-Energy Physics. How a Community Stopped Worrying about Journals and Learned to Love Repositories
arXiv.org, arXiv:0906.5418v1 [cs.DL], v1, 30 Jun 2009
From the abstract: The analysis of citation data demonstrates that free and immediate online dissemination of preprints creates an immense citation advantage in HEP, whereas publication in Open Access journals presents no discernible advantage. In addition, the analysis of clickstreams in the leading digital library of the field shows that HEP scientists seldom read journals, preferring preprints instead.

Lansingh, V. C. and Carter, M. J. (2009)
Does Open Access in Ophthalmology Affect How Articles are Subsequently Cited in Research? (abstract only, subscription required)
Ophthalmology, 116(8): 1425-1431, August 2009, available online 22 June 2009
From the abstract: Examination of 480 articles in ophthalmology in the experimental protocol and 415 articles in the control protocol. ... Four subject areas were chosen to search the ophthalmology literature in the PubMed database ... Searching started in December of 2003 and worked back in time to the beginning of the year. The number of subsequent citations for equal numbers of both open access (OA) and closed access (CA) (by subscription) articles was quantified using the Scopus database and Google search engine. A control protocol was also carried out to ascertain that the sampling method was not systematically biased by matching 6 ophthalmology journals (3 OA, 3 CA) using their impact factors, and employing the same search methodology to sample OA and CA articles. The total number of citations was significantly higher for open access articles compared to closed access articles for Scopus. However, univariate general linear model (GLM) analysis showed that access was not a significant factor that explained the citation data. Author number, country/region of publication, subject area, language, and funding were the variables that had the most effect and were statistically significant. Control protocol results showed no significant difference between open and closed access articles in regard to number of citations found by Scopus ... Unlike other fields of science, open access thus far has not affected how ophthalmology articles are cited in the literature.

Ostrowska, A. (2009)
Open Access Journals Quality – How to Measure It?
INFORUM 2009: 15th Conference on Professional Information Resources,Prague, May 27-29, 2009

Lin, S.-K. (2009)
Full Open Access Journals Have Increased Impact Factors (editorial)
Molecules, 2009, 14(6): 2254-2255

Mukherjee, B. (2009)
The hyperlinking pattern of open-access journals in library and information science: A cited citing reference study
Library & Information Science Research, 31 (2), April 2009, 113-125
This paper appears to be another take on this study

Tiwari, A. (2009)
Citation Trend Line For PLoS Journals
Fisheye Perspective blog, April 25, 2009
A short illustrated blog post on predicting the impact of a new journal. The author, a bioscientist, evaluates two PLoS (OA) journals using Scopus Journal Analyzer. Using the service's Trend Line and % Not Cited parameters the author predicts that one, a new journal that doesn't yet have an official impact factor, will soon rival the other, which does: "I am sure it's impact factor (or quality or what ever you love) is going to be same or may be much more." Does not claim to be statistically sound.

Gargouri, Y. and Harnad, S. (2009)
Logistic regression of potential explanatory variables on citation counts
Preprint 11/04/2009
Logistic regression analysis on the correlation between citation counts (as dependent variable) and a set of potential correlator/predictor variables.
Result: Published journal papers that are self-archived in institutional repositories - in this study the repositories mandate deposit, obviating the self-selection bias postulated by some to be a factor in self-archiving - can achieve a citation advantage whether published in journals of high or low impact factor (IF): "Overall, OA is correlated with a significant citation advantage for all journal IF intervals".
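A logistic regression of a binary citation outcome on an OA indicator plus a covariate can be sketched in plain Python. This is a generic illustration of the technique named in the title, not the study's actual model; the simulated data, coefficient values and helper `fit_logistic` are all invented:

```python
import math
import random

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit w, b for P(y=1|x) = sigmoid(w.x + b) by batch gradient descent."""
    w = [0.0] * len(xs[0])
    b = 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            err = p - y
            for i, xi in enumerate(x):
                gw[i] += err * xi
            gb += err
        w = [wi - lr * gi / n for wi, gi in zip(w, gw)]
        b -= lr * gb / n
    return w, b

random.seed(0)
# Invented data: features are [is_OA, num_authors/10]; outcome is "highly
# cited" (1) or not (0), simulated so that OA genuinely raises the odds.
xs, ys = [], []
for _ in range(400):
    oa = random.randint(0, 1)
    authors = random.randint(1, 10) / 10
    logit = 1.2 * oa + 0.8 * authors - 1.0
    xs.append([oa, authors])
    ys.append(1 if random.random() < 1 / (1 + math.exp(-logit)) else 0)

w, b = fit_logistic(xs, ys)
print(f"estimated OA odds ratio: {math.exp(w[0]):.2f}")  # true simulated effect is exp(1.2) ≈ 3.32
```

The fitted coefficient on the OA dummy, exponentiated, is the estimated odds ratio of the citation outcome for OA versus non-OA papers, holding the other predictors fixed, which is the kind of quantity a study like this reports per impact-factor interval.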

Watson, A. B. (2009)
Comparing citations and downloads for individual articles
Journal of Vision, April 3, 2009, Volume 9, Number 4, Editorial i, Pages 1-4
Measures the correlation between download and citation counts for articles in Journal of Vision: "Download statistics provide a useful indicator, two years in advance, of eventual citations."
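The underlying computation is a plain correlation between early downloads and later citations; a minimal sketch with invented per-article numbers:

```python
# Invented (early downloads, citations two years later) pairs per article.
data = [(120, 4), (340, 9), (80, 2), (560, 15), (210, 6), (430, 11), (150, 3), (300, 8)]

def pearson(pairs):
    """Pearson correlation coefficient of a list of (x, y) pairs."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = sum((x - mx) ** 2 for x, _ in pairs) ** 0.5
    sy = sum((y - my) ** 2 for _, y in pairs) ** 0.5
    return cov / (sx * sy)

r = pearson(data)
print("download-citation correlation r = %.3f" % r)
```

A high r on data like this is what licenses using downloads as a two-years-early proxy for citations; with real, noisier data the correlation is of course weaker.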

Bollen, J., Van de Sompel, H., Hagberg, A., Bettencourt, L., Chute, R., Rodriguez, M. A. and Balakireva, L. (2009)
Clickstream Data Yields High-Resolution Maps of Science
PLoS ONE, 4(3): e4803, March 11, 2009
See also Nature news article on this paper, 9 March 2009: "A striking difference in the usage maps is that journals in the humanities and social sciences figure much more prominently than in citation-based maps. The difference partly arises because Bollen's study covers a wider literature than the citation databases, which are biased towards natural sciences journals. "By including practitioners we capture a much wider sample of the scholarly community," adds Bollen. Usage maps are also more up to date than citation ones because the inherent delay in publication means it takes at least two years before a paper will start to gather citations in sufficient numbers to be meaningful. Anthony van Raan argues that this more current view may in fact represent today's "fashions", rather than trends that will endure."
These findings are not based on OA journals or papers, but highlight the emerging value of clicks, or hits, as possible contributory factors for online impact metrics.

Bernius, S. and Hanauske, M. (2009)
Open Access to Scientific Literature - Increasing Citations as an Incentive for Authors to Make Their Publications Freely Accessible
Institute for Information Systems, Frankfurt University, publications 2009, in 42nd Hawaii International Conference on System Sciences (HICSS '09), 5-8 Jan. 2009, pp. 1-9, http://dx.doi.org/10.1109/HICSS.2009.335
Summary of results also in
Bernius, S., Hanauske, M., König, W. and Dugall, B. (2009)
Open Access Models and their Implications for the Players on the Scientific Publishing Market (see section 1.2)
Economic Analysis and Policy Journal, Vol. 39, No. 1, March 2009

Åström, F. (2009)
Citation patterns in open access journals
OpenAccess.se and the National Library of Sweden, February 25, 2009.
"Fewer analyses have investigated whether OA and non-OA journals in the same research fields are citing the same literature; and to what extent this reflects whether it is the same kind (and thus comparable) research that is published in the two forms of scholarly publications. ... The citation structures in the journals were analysed through MDS maps building on co-citation analyses, as well as a more thorough comparison investigating overlaps of cited authors and journals between the different journals. ... The results of the analyses suggests that it is hard to draw any overall conclusions on the matter of whether research published in OA journals is likely to have a larger citation impact or not."
This conclusion is unsurprising since the study did not measure impact but mapped citation patterns between journals. It is suggested that these mappings could improve understanding when comparing the impact of OA and non-OA journals.

Gaulé, P. (2009)
Access to the scientific literature in India
CEMI Working Paper 2009-004, February 23, 2009, in Journal of the American Society for Information Science and Technology, Vol. 60, Issue 12, 2548-2553, published online: 8 Oct 2009
Abstract: This paper uses an evidence-based approach to assess the difficulties faced by developing country scientists in accessing the scientific literature. I compare backward citations patterns of Swiss and Indian scientists in a database of 43'150 scientific papers published by scientists from either country in 2007. Controlling for fields and quality with citing journal fixed effects, I find that Indian scientists (1) have shorter references lists (2) are more likely to cite articles from open access journals and (3) are less likely to cite articles from expensive journals. The magnitude of the effects is small which can be explained by informal file sharing practices among scientists.
See also
Patrick Gaulé and Nicolas Maystre, Free availability and diffusion of scientific articles, Vox, 23 June 2009
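The "citing journal fixed effects" control amounts to comparing the two countries only within the same citing journal. A toy within-journal (demeaning) comparison, with invented journal names and reference-list lengths:

```python
from collections import defaultdict

# Invented rows: (citing_journal, country, reference_list_length).
rows = [
    ("J1", "IN", 18), ("J1", "CH", 24), ("J1", "IN", 20), ("J1", "CH", 26),
    ("J2", "IN", 30), ("J2", "CH", 35), ("J2", "IN", 28), ("J2", "CH", 33),
]

# Fixed effects as a within transformation: demean the outcome inside each
# citing journal, then compare the country means of the residuals.
by_journal = defaultdict(list)
for j, _, y in rows:
    by_journal[j].append(y)
jmean = {j: sum(v) / len(v) for j, v in by_journal.items()}

resid = defaultdict(list)
for j, c, y in rows:
    resid[c].append(y - jmean[j])

gap = sum(resid["IN"]) / len(resid["IN"]) - sum(resid["CH"]) / len(resid["CH"])
print("Within-journal IN minus CH reference-list gap: %.2f" % gap)
```

The demeaning removes any level difference between journals (and hence between fields and quality tiers), so the remaining gap is attributable to country, which is the logic of finding (1) above.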

Evans, J. A. and Reimer, J. (2009)
Open Access and Global Participation in Science (full text requires subscription; summary only)
Science, Vol. 323, No. 5917, 20 February 2009, 1025
From the paper: "The influence of OA is more modest than many have proposed, at c. 8% for recently published research, but our work provides clear support for its ability to widen the global circle of those who can participate in science and benefit from it."
Listen to Science podcast interview with James Evans
See also articles on this paper:
Dolgin, E., Online access = more citations, The Scientist, 19th February 2009 (free registration required): "When the authors looked just at poorer countries, however, they found that the influence of open access was more than twice as strong. For example, in Bulgaria and Chile, researchers cited nearly 20% more open access articles, and in Turkey and Brazil, the number of citations rose by more than 25%. Free online availability "is not a huge driver of science in the first world, but it shapes parts of science in the rest of world," Evans told The Scientist."
Xie, Y., Open, electronic access to research crucial for global reach, ars technica, February 19, 2009

Bollen, J., Van de Sompel, H., Hagberg, A. and Chute, R. (2009)
A principal component analysis of 39 scientific impact measures
arXiv.org, arXiv:0902.2183v1 [cs.CY], 12 Feb. 2009, in PLoS ONE 4(6): e6022, http://dx.doi.org/10.1371/journal.pone.0006022
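A principal component analysis of correlated impact measures can be sketched with power iteration on a small covariance matrix; the measure names and journal scores below are invented, and the real study of course used 39 measures, not three:

```python
# Invented standardized scores of four journals on three impact measures.
measures = {
    "citation_based": [1.2, -0.5, 0.3, -1.0],
    "usage_based":    [1.0, -0.7, 0.5, -0.8],
    "hybrid":         [0.9, -0.4, 0.2, -0.7],
}
names = list(measures)
n = len(measures[names[0]])

def cov(a, b):
    """Covariance of two equal-length score lists."""
    ma, mb = sum(a) / n, sum(b) / n
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n

# Covariance matrix of the measures, then power iteration for the
# dominant eigenvector: the first principal component's loadings.
C = [[cov(measures[p], measures[q]) for q in names] for p in names]
v = [1.0] * len(names)
for _ in range(100):
    w = [sum(C[i][j] * v[j] for j in range(len(v))) for i in range(len(v))]
    norm = sum(x * x for x in w) ** 0.5
    v = [x / norm for x in w]

loadings = dict(zip(names, v))
print({k: round(x, 3) for k, x in loadings.items()})
```

When the measures are strongly correlated, as here, the first component loads on all of them with the same sign, which is why a handful of components can summarize dozens of impact measures.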

Castillo, M. (2009)
Citations and Open Access: Questionable Benefits
American Journal of Neuroradiology, February 2009
An editorial.
(Ed. For the record, but I cannot indicate what is questionable about OA, in this author's view, as I can't access any part of this, not even an abstract.)

Norris, M. (2009)
The citation advantage of open access articles
PhD thesis, Loughborough University Institutional Repository, 2009-01-15
Michael Norris has been named Highly Commended Award winner of the 2008 Emerald/EFMD Outstanding Doctoral Research Award in the Information Science category for this doctoral thesis.
Two published papers (JASIST, ElPub) are based on this work.

Frandsen, T. F. (2009)
The effects of open access on un-published documents: A case study of economics working papers
HAL: hprints-00352359, version 2, 12 January 2009, Journal of Informetrics (2009) in press

O'Leary, D. E. (2008)
The relationship between citations and number of downloads
Decision Support Systems, Vol. 45, No. 4, November 2008, 972-980, available online 11 April 2008 (full text requires subscription; abstract only)
Broadly agrees with earlier findings (e.g. Brody et al.) about the correlation - 'strong positive statistically significant relationship' - between downloads and citations for digital papers, notably for the most-downloaded, 'top' papers, in this case based on data for a single, focussed source, the journal Decision Support Systems.

Mukherjee, B. (2008)
Do open-access journals in library and information science have any scholarly impact? A bibliometric study of selected open-access journals using Google Scholar (full text requires subscription; abstract only)
Journal of the American Society for Information Science and Technology, Vol. 60, No. 3, March 2009, 581-594, published online: 16 Dec 2008
From the abstract: "Using 17 fully open-access journals published uninterruptedly during 2000 to 2004 in the field of library and information science, the present study investigates the impact of these open-access journals in terms of quantity of articles published, subject distribution of the articles, synchronous and diachronous impact factor, immediacy index, and journals' and authors' self-citation."
The paper does not appear to reveal any comparative findings (OA vs non-OA).

Tenopir, C. and King, D. W. (2008)
Electronic Journals and Changes in Scholarly Article Seeking and Reading Patterns
D-Lib Magazine, Vol. 14 No. 11/12,November/December 2008
From the abstract: "Reading patterns and citation patterns differ, as faculty read many more articles than they ultimately cite and read for many purposes in addition to research and writing. The number of articles read has steadily increased over the last three decades, so the actual numbers of articles found by browsing has not decreased much, even though the percentage of readings found by searching has increased. Readings from library-provided electronic journals has increased substantially, while readings of older articles have recently increased somewhat. Ironically, reading patterns have broadened with electronic journals at the same time citing patterns have narrowed."



Gaule, P. and Maystre, N. (2008)
Getting cited: does open access help?
Ecole Polytechnique Fédérale de Lausanne, CEMI-WORKINGPAPER-2008-007, November 2008. In Research Policy, in press, available online 2 July 2011, info:doi/10.1016/j.respol.2011.05.025. Also available from RePEc http://ideas.repec.org/p/cmi/wpaper/cemi-workingpaper-2008-007.html
Explains the 'widely held belief' that free availability of scientific articles increases the number of citations they receive thus: "Since open access is relatively more attractive to authors of higher quality papers, regressing citations on open access and other controls yields upward-biased estimates." Findings are based on a sample of 4388 biology papers published between May 2004 and March 2006 by Proceedings of the National Academy of Sciences (PNAS). "Using an instrumental variable approach, we find no significant effect of open access. Instead, self-selection of higher quality articles into open access explains at least part of the observed open access citation advantage." Note, OA in PNAS is itself self-selective by virtue of its OA charging structure (i.e. payment is required for the published article to be OA, but not if it is not OA).
See also
Patrick Gaulé and Nicolas Maystre, Free availability and diffusion of scientific articles, Vox, 23 June 2009

Harnad, S., Getting Excited About Getting Cited: No Need To Pay For OA, Open Access Archivangelism, August 19, 2011: What G & M have shown, convincingly, is that in the special case of having to pay for OA in a hybrid Gold journal (PNAS: a high-quality journal that makes all articles OA on its website 6 months after publication), the article quality and author self-selection factors alone (plus the availability of funds in the annual funding cycle) account for virtually all the significant variance in the OA citation advantage: Paying extra to provide hybrid Gold OA during those first 6 months does not buy authors significantly more citations. G & M correctly acknowledge, however, that neither their data nor their economic model apply to Green OA self-archiving, which costs the author nothing and can be provided for any article, in any journal (most of which are not made OA on the publisher's website 6 months after publication, as in the case of PNAS). Yet it is on Green OA self-archiving that most of the studies of the OA citation advantage (and the ones with the largest and most cross-disciplinary samples) are based.
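The instrumental-variable logic behind the Gaulé and Maystre result can be illustrated with a simple Wald estimator, a one-instrument stand-in for their actual setup; the instrument, OA-uptake and citation numbers below are all invented:

```python
# Invented rows: (z: binary instrument, e.g. funds available,
#                 d: OA chosen, y: citations).
rows = [
    (1, 1, 12), (1, 1, 10), (1, 0, 9), (1, 1, 11),
    (0, 0, 8), (0, 1, 13), (0, 0, 7), (0, 0, 9),
]

def mean(vals):
    return sum(vals) / len(vals)

# Wald estimator: the instrument's effect on citations divided by its
# effect on OA uptake; valid only if z shifts OA choice but not quality.
y1 = mean([y for z, d, y in rows if z == 1])
y0 = mean([y for z, d, y in rows if z == 0])
d1 = mean([d for z, d, y in rows if z == 1])
d0 = mean([d for z, d, y in rows if z == 0])
beta_iv = (y1 - y0) / (d1 - d0)
print("IV estimate of OA effect on citations: %.2f" % beta_iv)
```

The point of the instrument is that a naive comparison of OA and non-OA citation means is contaminated by self-selection of better papers into OA; the IV estimate isolates only the citation variation induced by the instrument's effect on OA choice.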

Frandsen, T. F. (2008)
Attracted to open access journals: a bibliometric author analysis in the field of biology
Hprints, Nordic arts and humanities e-print archive, HAL: hprints-00328270, version 1, 10 October 2008
also in Journal of Documentation, January 2009, http://www.emeraldinsight.com/Insight/viewContentItem.do?contentType=Article&contentId=1766883

Frandsen, T. F. (2008)
The integration of open access journals in the scholarly communication system: Three science fields
Hprints, Nordic arts and humanities e-print archive, HAL: hprints-00326285, version 1, 2 October 2008
also in Information Processing & Management, January 2009, http://dx.doi.org/10.1016/j.ipm.2008.06.001
From the abstract: "This study is an analysis of the citing behaviour in (open access) journals within three science fields: biology, mathematics, and pharmacy and pharmacology. The integration of OAJs in the scholarly communication system varies considerably across fields. The implications for bibliometric research are discussed."

DeGroote, S. L. (2008)
Citation patterns of online and print journals in the digital age
J. Med. Libr. Assoc., 2008 October; 96(4): 362-369
From the abstract: "Journals available in electronic format were cited more frequently in publications from the campus whose library had a small print collection, and the citation of journals available in both print and electronic formats generally increased over the years studied."

Kousha, K. (2008)
Characteristics of Open Access Web Citation Network: A Multidisciplinary Study
Proceedings of WIS 2008, Fourth International Conference on Webometrics, Informetrics and Scientometrics & Ninth COLLNET Meeting (Berlin, 28 July - 1 August 2008), edited by H. Kretschmer and F. Havemann, October 2008

Lariviere, V., Gingras, Y. and Archambault, E. (2008)
The decline in the concentration of citations, 1900-2007
arXiv.org, arXiv:0809.5250v1 [physics.soc-ph], 30 Sep 2008 and in Journal of the American Society for Information Science and Technology, Vol. 60, No. 4, April 2009, 858-862, published online: 29 Jan 2009
From the abstract: "This paper challenges recent research (Evans, 2008) reporting that the concentration of cited scientific literature increases with the online availability of articles and journals. ... contrary to what was reported by Evans, the dispersion of citations is actually increasing."


Davis, P. M.
Author-choice open access publishing in the biological and medical literature: a citation analysis
arXiv.org, arXiv:0808.2428v1 [cs.DL], 18 Aug 2008, in Journal of the American Society for Information Science & Technology, Vol. 60, No. 1, January 2009, 3-8, published online: 25 Sep 2008
This study is a follow-up to the controlled trial of open access publishing published in the BMJ: "According to a study of 11 biological and medical journals that allow authors the choice of making their articles freely available from the publisher's website, few show any evidence of a citation advantage. For those that do, the effect appears to be diminishing over time. ... (the paper) analyzed over eleven thousand articles published in journals since 2003, sixteen hundred of these articles (15%) adopting the author-choice open access model."


Clauson, K. A., Veronin, M. A., Khanfar, N. M. and Lou, J. Q.
Open-access publishing for pharmacy-focused journals (full text requires subscription; summary only)
American Journal of Health-System Pharmacy, Vol. 65, No. 16, 1539-1544, August 15, 2008
From the conclusion: A very small number of pharmacy-focused journals adhere to the OA paradigm of access. However, journals that adopt some elements of the OA model, chiefly free accessibility, may be more likely to be cited than traditional journals. Pharmacy practitioners, educators, and researchers could benefit from the advantages that OA offers but should understand its financial disadvantages.
The same issue has an editorial by C. Richard Talley, Open-access publishing: why not?, accessible only to subscribers


Henneken, E. A., Kurtz, M. J., Accomazzi, A., Grant, C. S., Thompson, D., Bohlen, E. and Murray, S. S. (2008)
Use of Astronomical Literature - A Report on Usage Patterns
arXiv.org, arXiv:0808.0103v1 [cs.DL], 1 Aug 2008
in Journal of Informetrics, Vol. 3, Issue 1, 1-90 (January 2009)

Davis, P.M., Lewenstein, B. V., Simon, D. H., Booth, J. G. and Connolly, M. J. L. (2008)
Open access publishing, article downloads, and citations: randomised controlled trial
BMJ, 2008;337:a568, published 31 July 2008
See also this update by Philip M Davis:
Results of Open Access RCT Robust at 3 Years, BMJ Rapid Response, 23 November 2010: "All of the articles in our study have now aged 3-years and we report that our initial findings were robust: articles receiving the open access treatment received more article downloads but no more citations"






Levitt, J. M. and Thelwall, M. (2008)
Patterns of annual citation of highly cited articles and the prediction of their citation ranking: A comparison across subjects (full text requires subscription; abstract only)
Scientometrics, Vol. 77, No. 1 (2008) 41-46, published online: 24 July 2008
From the abstract: "For four of the six subjects, there is a correlation of over 0.42 between the percentage of early citations and total citation ranking but more highly ranked articles had a lower percentage of early citations. Surprisingly, for highly cited articles in all six subjects the prediction of citation ranking from the sum of citations during their first six years was less accurate than prediction using the sum of the citations for only the fifth and sixth year."
Open access is not a factor here. The highly-cited subject articles investigated here date from 1969-71. For open access papers, Brody et al. (2005) revealed a correlation to predict impact from much earlier data, i.e. download data for OA papers, before any citations.

Evans, J. A. (2008)
Electronic Publication and the Narrowing of Science and Scholarship (full text requires subscription; abstract only)
Science, Vol. 321, No. 5887, 18 July 2008, 395-399
From the abstract: "Using a database of 34 million articles, their citations (1945 to 2005), and online availability (1998 to 2005), I show that as more journal issues came online, the articles referenced tended to be more recent, fewer journals and articles were cited, and more of those citations were to fewer journals and articles. The forced browsing of print archives may have stretched scientists and scholars to anchor findings deeply into past and present scholarship. Searching online is more efficient and following hyperlinks quickly puts researchers in touch with prevailing opinion, but this may accelerate consensus and narrow the range of findings and ideas built upon."
This work was funded by the National Science Foundation. See the NSF press release and video interview with James Evans
See also the news feature Great minds think (too much) alike, The Economist, July 17th 2008







Norris, M., Oppenheim, C., and Rowland, F.
The citation advantage of open-access articles (full text requires subscription; abstract only)
Journal of the American Society for Information Science and Technology, Vol. 59, No. 12, 2008, 1963-1972, published online: 9 July 2008
also available from Loughborough University Institutional Repository, 2009-01-12, http://hdl.handle.net/2134/4083
From the abstract: "Of a sample of 4,633 articles examined, 2,280 (49%) were OA and had a mean citation count of 9.04 whereas the mean for (toll access) TA articles was 5.76. There appears to be a clear citation advantage for those articles that are OA as opposed to those that are TA. This advantage, however, varies between disciplines, with sociology having the highest citation advantage, but the lowest number of OA articles, from the sample taken, and ecology having the highest individual citation count for OA articles, but the smallest citation advantage. Tests of correlation or association between OA status and a number of variables were generally found to be weak or inconsistent. The cause of this citation advantage has not been determined."

Eger, A. (2008)
Database statistics applied to investigate the effects of electronic information services on publication of academic research - a comparative study covering Austria, Germany and Switzerland
GMS Medizin - Bibliothek - Information, June 26, 2008
Findings on increased usage of online full text articles leading to increased publication, but says nothing on the effects of such access on citation practices

Norris, M., Oppenheim, C. and Rowland, F. (2008)
Open Access Citation Rates and Developing Countries
12th International Conference on Electronic Publishing (ElPub 2008),Toronto, June 25-27, 2008
"the admittedly small number of citations from authors in developing countries do indeed seem to show a higher proportion of citations being given to OA articles than is the case for citations from developed countries."

Sheikh Mohammad, S.
Research impact of open access contributions across disciplines
12th International Conference on Electronic Publishing (ElPub 2008),Toronto, June 25-27, 2008

Dietrich, J. P. (2008)
Disentangling visibility and self-promotion bias in the arXiv: astro-ph positional citation effect
arXiv.org, arXiv:0805.0307v2 [astro-ph], 25 Jun 2008, in Publications of the Astronomical Society of the Pacific, 120 (869): 801-804


Cheng, W. H. and Ren, S. L. (2008)
Evolution of open access publishing in Chinese scientific journals (full text requires subscription; abstract only)
Learned Publishing, Vol. 21, No. 2, April 2008, 140-152
From the abstract: "Citation indicators of OA journals were found to be higher than those of non-OA journals."

Harnad, S., Brody, T., Vallières, F., Carr, L., Hitchcock, S., Gingras, Y., Oppenheim, C., Hajjem, C. and Hilf, E. R. (2008)
The Access/Impact Problem and the Green and Gold Roads to Open Access: An Update
Serials Review, Vol. 34, Issue 1, March 2008, 36-40, available online 6 March 2008
also available from ECS EPrints, 06 Jun 2008, http://eprints.ecs.soton.ac.uk/15852/
Update to the paper published in Serials Review, 30(4), 2004

Lokker, C., McKibbon, K. A., McKinlay, R.J., Wilczynski, N. L. and Haynes, R.B. (2008)
Prediction of citation counts for clinical articles at two years using data available within three weeks of publication: retrospective cohort study
BMJ, 2008;336:655-657 (22 March), published 21 February 2008
"Conclusion: Citation counts can be reliably predicted at two years using data within three weeks of publication."


Chu, H. and Krichel, T. (2008)
Downloads vs. Citations: Relationships, Contributing Factors and Beyond
E-LIS, 9 February 2008, in 11th Annual Meeting of the International Society for Scientometrics and Informetrics, Madrid, 25-27 June 2007
From the abstract: "In a nutshell, an infrastructure that encourages downloading at digital libraries would eventually lead to higher usage of their resources."

Turk, N. (2008)
Citation impact of Open Access journals (full text requires subscription; summary only)
New Library World, Vol. 109, No. 1/2, January/February 2008, 65-74
Review of the main research about citation impact of Open Access journals, focused on LIS journals.

Hardisty, D. J. and Haaga, D. A. F. (2008)
Diffusion of Treatment Research: Does Open Access Matter? (pdf 39pp)
Center for the Decision Sciences, Columbia University, in Journal of Clinical Psychology, Vol. 64(7), 1-19 (2008)
From the abstract: "In a pair of studies, mental health professionals were given either no citation, a normal citation, a linked citation, or a free access citation and were asked to find and read the cited article. After one week, participants read a vignette on the same topic as the article and gave recommendations for an intervention. In both studies, those given the free access citation were more likely to read the article, yet only in one study did free access increase the likelihood of making intervention recommendations consistent with the article."

Kousha, K. and Thelwall, M. (2007)
The Web impact of open access social science research (full-text requires subscription; otherwise abstract only)
Library & Information Science Research, Volume 29, Issue 4, December 2007, 495-507, available online 15 October 2007
preprint http://www.scit.wlv.ac.uk/~cm1993/papers/OpenAccessSocialSciencePreprint.doc (.doc 12pp)
From the abstract: "The results suggest that new types of citation information and informal scholarly indicators could be extracted from the Web for the social sciences."

Dietrich, J. P. (2007)
The Importance of Being First: Position Dependent Citation Rates on arXiv:astro-ph
arXiv.org, arXiv:0712.1037v1 [astro-ph], 6 December 2007, in Publications of the Astronomical Society of the Pacific, 120 (864): 224-228, February 2008
From the abstract: "We study the dependence of citation counts of e-prints published on the arXiv:astro-ph server on their position in the daily astro-ph listing. ... cannot exclude that increased visibility at the top of the daily listings contributes to higher citation counts as well."


Kurtz, M. J. and Henneken, E. A. (2007)
Open Access does not increase citations for research articles from The Astrophysical Journal
arXiv.org, arXiv:0709.0896v1 [cs.DL], 6 September 2007
Abstract: We demonstrate conclusively that there is no "Open Access Advantage" for papers from the Astrophysical Journal. The two to one citation advantage enjoyed by papers deposited in the arXiv e-print server is due entirely to the nature and timing of the deposited papers. This may have implications for other disciplines.


Sotudeh, H. and Horri, A. (2007)
The citation performance of open access journals: A disciplinary investigation of citation distribution models (full-text subscribers only; no abstract)
Journal of the American Society for Information Science and Technology,Vol. 58, No. 13, 2007, 2145-2156, published online August 17, 2007
From the conclusion: "To sum up, the similarity of the science system across OAJ and NOAJ boundaries has been confirmed. We see this as further evidence of OA's widespread recognition by scientific communities. However, because the magnitudes of the exponents found in this study are lower than what was previously observed for the whole system, OA may currently perform at a slightly lower level. According to the models used in this study, the citation distributions between fields are strongly disproportionate in Life Sciences and Engineering and Material Sciences, favoring larger fields in the former, but smaller fields in the latter. However, the distributions tend to be rather linear in the Natural Sciences."

Brody, T., Carr, L., Gingras, Y., Hajjem, C., Harnad, S. and Swan, A. (2007)
Incentivizing the Open Access Research Web: Publication-Archiving, Data-Archiving and Scientometrics
CTWatch Quarterly, Vol. 3, No. 3, August 2007

Lin, S.-K. (2007)
Editorial: Non-Open Access and Its Adverse Impact on Molecules
Molecules, 12, 1436-1437, 16 July 2007
The point of this short editorial is clear, that the difference between the OA and non-OA content in the journal Molecules is clearly reflected in higher citations for the former. The context could be clearer, however. The OA/non-OA history of the journal, especially prior to the period under review (2005-6), is not elaborated and familiarity with the journal is assumed.

Taylor, D. (2007)
Looking for a Link: Comparing Faculty Citations Pre and Post Big Deals
Electronic Journal of Academic and Special Librarianship, v.8 no.1 (Spring 2007)
Note. The Big Deal is where a library or consortium of libraries subscribes to a larger package of a publisher's journals than they would have if they had subscribed to journals individually. Big Deals are claimed to improve access for an institution's users. "Pre Big Deal, the percentage of citations to journals that are part of Big Deals but were previously not subscribed to was an average of 2.6%. Post Big Deal this increased to an average of 6.1%." There is no analysis or comment on how this result might be affected if it was considering open access.

Craig, I. D., Plume, A. M., McVeigh, M. E., Pringle, J. and Amin, M. (2007)
Do Open Access Articles Have Greater Citation Impact? A critical review of the literature
Publishing Research Consortium, undated (announced 17 May 2007), Journal of Informetrics, 1 (3): 239-248, July 2007



Tonta, Y., Ünal, Y. and Al, U. (2007)
The Research Impact of Open Access Journal Articles
E-LIS, 30 April 2007, also in Proceedings ELPUB 2007, the 11th International Conference on Electronic Publishing, Vienna, 13-15 June 2007


Sharma, H. P. (2007)
Download plus citation counts - a useful indicator to measure research impact (correspondence, pdf 1pp)
Current Science, 92 (7): 873-873, April 10, 2007

Piwowar, H. A., Day, R. S. and Fridsma, D. B. (2007)
Sharing Detailed Research Data Is Associated with Increased Citation Rate
PLoS ONE, March 21, 2007
Principal Findings: "We examined the citation history of 85 cancer microarray clinical trial publications with respect to the availability of their data. The 48% of trials with publicly available microarray data received 85% of the aggregate citations. Publicly available data was significantly (p = 0.006) associated with a 69% increase in citations, independently of journal impact factor, date of publication, and author country of origin using linear regression."
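A minimal sketch of the regression idea: log citations on a data-availability dummy, with the paper's controls (impact factor, date, country) omitted for brevity and all numbers invented:

```python
import math

# Invented (data_shared, citations) pairs; real analysis adds controls.
papers = [(1, 40), (1, 25), (0, 12), (1, 30), (0, 15), (0, 10), (1, 22), (0, 14)]

x = [float(d) for d, _ in papers]
y = [math.log(c) for _, c in papers]
n = len(papers)
mx, my = sum(x) / n, sum(y) / n

# OLS slope of log citations on the data-availability dummy;
# exp(slope) - 1 converts the log-point effect to a percentage increase.
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
pct = (math.exp(slope) - 1) * 100
print("Estimated citation increase for shared data: %.0f%%" % pct)
```

Working on the log scale is what makes the coefficient readable as a percentage effect, which is how the 69% figure above is expressed.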


Bergstrom, T. C. and Lavaty, R. (2007)
How often do economists self-archive?
eScholarship Repository, University of California, February 8, 2007


Chapman, S., Nguyen, T. N. and White, C. (2007)
Press-released papers are more downloaded and cited (full text requires subscription; extract only)
Tobacco Control, 16 (1): 71-71, February 2007

Harnad, S. and Hajjem, C. (2007)
The Open Access Citation Advantage: Quality Advantage Or Quality Bias?
Author blog, Open Access Archivangelism, 21 January 2007
Does the OA Advantage (OAA) occur because authors are more likely to self-selectively self-archive articles that are more likely to be cited (self-selection "Quality Bias": QB), or because articles that are self-archived are more likely to be cited ("Quality Advantage": QA)? Preliminary evidence based on over 100,000 articles from multiple fields, comparing self-selected self-archiving with mandated self-archiving to estimate the contributions of QB and QA to the OAA shows: "Both factors contribute, and the contribution of QA is greater." Includes comment on Moed, H. (2006), The effect of 'Open Access' upon citation impact: An analysis of ArXiv's Condensed Matter Section.


Harnad, S. (2007)
Citation Advantage For OA Self-Archiving Is Independent of Journal Impact Factor, Article Age, and Number of Co-Authors
Author blog, Open Access Archivangelism, 17 January 2007
Further comment on Eysenbach, G. (2006), Citation Advantage of Open Access Articles: "The OA-self-archiving advantage remains a robust, independent factor."

Brody, T. (2007)
Evaluating Research Impact through Open Access to Scholarly Communication
PhD, Electronics and Computer Science, University of Southampton, May 2006, in ECS EPrints, 14 January 2007

McDonald, J. D. (2007)
Understanding Online Journal Usage: A Statistical Analysis of Citation and Use
Journal of the American Society for Information Science & Technology, 58(1): 39-50, January 1, 2007, also in Caltech Library System Papers and Publications, 18 May 2006

Knowlton, S. A. (2007)
Continuing use of print-only information by researchers
J Med Libr Assoc., 95(1): 83-88, January 2007
"To study the question, "Are researchers still accessing and using material issued only in print?," a group of journals was selected, and the impact factor of each was tracked over the period 1993-2003.
Conclusion: the online status of a journal is not sufficient to override all other considerations by researchers when they choose which material to cite."

Walters, G. D. (2006)
Predicting subsequent citations to articles published in twelve crime-psychology journals: Author impact versus journal impact (full text requires subscription; abstract only)
Scientometrics, 69 (3): 499-510, December 2006
"These results suggest that author impact may be a more powerful predictor of citations received by a journal article than the periodical in which the article appears."

Harnad, S. (2006)
The Self-Archiving Impact Advantage: Quality Advantage or Quality Bias?
Author blog, Open Access Archivangelism, 20 November 2006

Moed, H. F. (2006)
The effect of 'Open Access' upon citation impact: An analysis of ArXiv's Condensed Matter Section
ArXiv, Computer Science, cs.DL/0611060, 14 November 2006, in Journal of the American Society for Information Science and Technology, Vol. 58, No. 13, 2007, 2145-2156, published online August 30, 2007, http://dx.doi.org/10.1002/asi.20663 (subscriber access only to full text)
"This article statistically analyses how the citation impact of articles deposited in the Condensed Matter section of the preprint server ArXiv, and subsequently published in a scientific journal, compares to that of articles in the same journal that were not deposited in that archive. Its principal aim is to further illustrate and roughly estimate the effect of two factors, 'early view' and 'quality bias', upon differences in citation impact between these two sets of papers ... The analysis provided evidence of a strong quality bias and early view effect. Correcting for these effects, there is in a sample of 6 condensed matter physics journals studied in detail, no sign of a general 'open access advantage' of papers deposited in ArXiv. The study does provide evidence that ArXiv accelerates citation, due to the fact that ArXiv makes papers earlier available rather than that it makes papers freely available."




Bollen, J. and Van de Sompel, H. (2006)
Usage Impact Factor: the effects of sample characteristics on usage-based impact metrics
arXiv.org > cs > arXiv:cs/0610154v2 [cs.DL], 26 October 2006, in Journal of the American Society for Information Science and Technology, 59(1): 136-149, January 1, 2008

Mayr, P. (2006)
Constructing experimental indicators for Open Access documents
E-LIS, 05 October 2006, in Research Evaluation, special issue on 'Web indicators for Innovation Systems', Vol. 15, No. 2, 1 August 2006, 127-132
Author preprint, http://www.ib.hu-berlin.de/~mayr/arbeiten/mayr_RE06.pdf (pdf 9pp)

Henneken, E. A., Kurtz, M. J., Warner, S., Ginsparg, P., Eichhorn, G., Accomazzi, A., Grant, C. S., Thompson, D., Bohlen, E. and Murray, S. S. (2006)
E-prints and Journal Articles in Astronomy: a Productive Co-existence
ArXiv, Computer Science, cs.DL/0609126, 22 September 2006, in Learned Publishing, Vol. 20, No. 1, January 2007, 16-22



Jacsó, P. (2006)
Open Access to Scholarly Full Text Documents (pdf 8pp)
Online Information Review, 30(5) 2006, 587-594

Zhang, Y. (2006)
The Effect of Open Access on Citation Impact: A Comparison Study Based on Web Citation Analysis (abstract only)
Libri, September 2006 (Full text for subscribers)

Kurtz, M. and Brody, T. (2006)
The impact loss to authors and research
e-Prints Soton, 12 July 2006, in Jacobs, N. (ed.), Open Access: Key strategic, technical and economic aspects (Oxford, UK: Chandos Publishing)

Metcalfe, T. S. (2006)
The Citation Impact of Digital Preprint Archives for Solar Physics Papers
Solar Physics, Vol. 239, No. 1-2, December 2006, pp. 549-553
also in ArXiv, Astrophysics, astro-ph/0607079, 5 July 2006, http://arxiv.org/abs/astro-ph/0607079
"Most astronomers now use the arXiv.org server (astro-ph) to distribute preprints, but the solar physics community has an independent archive hosted at Montana State University. For several samples of solar physics papers published in 2003, I quantify the boost in citation rates for preprints posted to each of these servers. I show that papers on the MSU archive typically have citation rates 1.7 times higher than the average of similar papers that are not posted as preprints, while those posted to astro-ph get 2.6 times the average. A comparable boost is found for papers published in conference proceedings, suggesting that the higher citation rates are not the result of self-selection of above-average papers."

Henneken, E. A., Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C.,Thompson, D., and Murray, S. S. (2006)
Effect of E-printing on Citation Rates in Astronomy and Physics
Journal of Electronic Publishing, Vol. 9, No. 2, Summer 2006, also in ArXiv, Computer Science, cs.DL/0604061, v2, 5 June 2006, http://arxiv.org/abs/cs/0604061
"It has been observed that papers that initially appear as arXiv e-prints get cited more than papers that do not. Using the citation statistics from the NASA-Smithsonian Astrophysics Data System, we confirm the findings from other studies, we examine the average citation rate to e-printed papers in the Astrophysical Journal, and we show that for a number of major astronomy and physics journals the most important papers are submitted to the arXiv e-print repository first."



Tschider, C. (2006)
Investigating the public in the Public Library of Science: Gifting economics in the Internet community
First Monday, Vol. 11, No. 6, June 2006
Thanks to Chris Maloney @Klortho

Kousha, K. and Thelwall, M. (2006)
Google Scholar Citations and Google Web/URL Citations: A Multi-Discipline Exploratory Analysis
E-LIS, 05 June 2006, also in Proceedings International Workshop on Webometrics, Informetrics and Scientometrics & Seventh COLLNET Meeting, Nancy (France), May 2006
"we built a sample of 1,650 articles from 108 Open Access (OA) journals published in 2001 in four science and four social science disciplines. We recorded the number of citations to the sample articles using several methods based upon the ISI Web of Science, Google Scholar and the Google search engine (Web/URL citations). For each discipline, we found significant correlations between ISI citations and both Google Scholar and Google Web/URL citations; with similar results when using total or average citations, and when comparing within and across (most) journals."

Eysenbach, G. (2006)
Citation Advantage of Open Access Articles
PLoS Biology, Volume 4, Issue 5, May 2006
Further evidence for the OA citation advantage, although quite critical of other studies with which its findings broadly agree. This example is based on a small, single-journal sample (PNAS: Proceedings of the National Academy of Sciences). Since PNAS offers authors the choice of paying to provide open access to published papers and/or freely self-archiving, a 'Secondary analysis' considers the relative impact of each type of OA, although the number of papers involved is really too small to give this result the weight of the broader findings. The paper is accompanied by two editorials, one in the publishing journal, the other a self-published editorial by the author:
MacCallum, C. J. and Parthasarathy, H. (2006) Editorial: Citation Advantage of Open Access Articles, PLoS Biology, Volume 4, Issue 5, May 2006
Eysenbach, G. (2006) The Open Access Advantage, Journal of Medical Internet Research, 2006;8(2):e8










Davis, P. M. and Fromerth, M. J. (2006)
Does the arXiv lead to higher citations and reduced publisher downloads for mathematics articles? (pdf 12pp)
draft manuscript, ArXiv.org, cs.DL/0603056, 14 March 2006, in Scientometrics, Vol. 71, No. 2, May 2007




Harnad, S. (2006)
OA Impact Advantage = EA + (AA) + (QB) + QA + (CA) + UA
Author eprint, 14 March 2006, ECS EPrints repository, School of Electronics and Computer Science, University of Southampton

Mueller, P. S., Murali, N. S., Cha, S. S., Erwin, P. J. and Ghosh, A. K. (2006)
The effect of online status on the impact factors of general internal medicine journals
Netherlands Journal of Medicine, 64 (2): 39-44, February 2006
"becoming available online as FUTON (full text on the Net) is associated with a significant increase in journal impact factor."

Hajjem, C., Harnad, S. and Gingras, Y. (2005)
Ten-Year Cross-Disciplinary Comparison of the Growth of Open Access and How it Increases Research Citation Impact (pdf 8pp)
IEEE Data Engineering Bulletin, Vol. 28, No. 4, December 2005
also Author eprint, 16 December 2005, http://eprints.ecs.soton.ac.uk/11688/
"In 2001, Lawrence found that articles in computer science that were openly accessible (OA) on the Web were cited substantially more than those that were not. We have since replicated this effect in physics. To further test its cross-disciplinary generality, we used 1,307,038 articles published across 12 years (1992-2003) in 10 disciplines (Biology, Psychology, Sociology, Health, Political Science, Economics, Education, Law, Business, Management). The overall percentage of OA (relative to total OA + NOA) articles varies from 5%-16% (depending on discipline, year and country) and is slowly climbing annually. Comparing OA and NOA articles in the same journal/year, OA articles have consistently more citations, the advantage varying from 25%-250% by discipline and year."

Hajjem, C., Gingras, Y., Brody, T., Carr, L. and Harnad, S. (2005)
Open Access to Research Increases Citation Impact (.doc 12pp)
Author eprint, 16 December 2005, Technical Report, Institut des sciences cognitives, Université du Québec à Montréal

Sahu, D.K., Gogtay, N.J. and Bavdekar, S.B. (2005)
Effect of open access on citation rates for a small biomedical journal
Author eprint, December 1, 2005, in Fifth International Congress on Peer Review and Biomedical Publication, Chicago, September 16-18, 2005
"We assessed the influence of OA on citation rates for a small, multi-disciplinary journal which adopted OA without article submission or article access fee. DESIGN: The full text of articles published since 1990 were made available online in 2001. Citations for these articles as retrieved using Web of Science, SCOPUS, and Google Scholar were divided into two groups - the pre-OA period (1990-2000) and the post-OA period (2001-2004). CONCLUSIONS: Open access was associated with increase in the number of citations received by the articles. It also decreased the lag time between publication and the first citation. For smaller biomedical journals, OA could be one of the means for improving visibility and thus citation rates."

Zhao, D. (2005)
Challenges of scholarly publications on the Web to the evaluation of science -- A comparison of author visibility on the Web and in print journals (abstract only)
Information Processing and Management, 41:6, 1403-1418, December 2005
Compares author visibility between the Web and print journals as revealed from citation analysis based on a search for the term "XML" or "eXtensible Markup Language" using NEC Research Institute's CiteSeer, the entire ISI Science Citation Index (SCI) database, and journals indexed and classified in SCI as representing computer science research. The main finding: "The author ranking by number of citations that resulted from CiteSeer data is highly correlated with that obtained from SCI." i.e. it's not comparing OA impact vs non-OA but Web vs journal, and finds that authors, notably the top authors, are self-archiving and publishing papers in both places.

Coats, A. J. S. (2005)
Top of the charts: download versus citations in the International Journal of Cardiology (full-text requires subscription; otherwise abstract only)
International Journal of Cardiology, Volume 105, Issue 2, 2 November 2005, 123-125, available online 7 October 2005
From the abstract: "We have recorded the 10 top cited articles over a 12-month period and compared them to the 10 most popular articles being downloaded over the same time period. The citation-based listing included basic and applied, observational and interventional original research reports. For downloaded articles, which have shown a dramatic increase for the International Journal of Cardiology from 48,000 in 2002 to 120,000 in 2003 to 200,000 in 2004, the most popular articles over the same period are very different and are dominated by up-to-date reviews of either cutting-edge topics (such as the potential of stem cells) or of the management of rare or unusual conditions. There is no overlap between the two lists despite covering exactly the same 12-month period and using measures of peer esteem. Perhaps the time has come to look at the usage of articles rather than, or in addition to, their referencing."

Adams, J. (2005)
Early citation counts correlate with accumulated impact (abstract only)
Scientometrics, 63 (3): 567-581, June 2005
Working towards earlier prediction of impact. This paper is not OA and has just appeared but was written before Brody et al. (2005) revealed a correlation to predict impact from even earlier data, i.e. download data for OA papers, before any citations.

Moed, H. F. (2005)
Statistical Relationships Between Downloads and Citations at the Level of Individual Documents Within a Single Journal (abstract only)
Journal of the American Society for Information Science and Technology, 56(10): 1088-1097, published online 31 May 2005
"Statistical relationships between downloads from ScienceDirect of documents in Elsevier's electronic journal Tetrahedron Letters and citations to these documents recorded in journals processed by the Institute for Scientific Information (ISI) for the Science Citation Index (SCI) are examined. ... Findings suggest that initial downloads and citations relate to distinct phases in the process of collecting and processing relevant scientific information that eventually leads to the publication of a journal article." Does not investigate open access sources. Notes the need for caution in drawing conclusions on the frequency of paper downloads from formal citation patterns, and vice versa.

Vaughan, L. and Shaw, D. (2005)
Web citation data for impact assessment: A comparison of four science disciplines (abstract only)
Journal of the American Society for Information Science and Technology, Vol. 56, No. 10, 1075-1087, published online 27 May 2005
appears to be an expansion of Can Web Citations be a Measure of Impact? An Investigation of Journals in the Life Sciences (abstract only)
ASIST 2004: Proceedings of the 67th ASIS&T Annual Meeting, Vol. 41 (Medford, USA: Information Today), pp. 516-526

Brody, T., Harnad, S. and Carr, L. (2005)
Earlier Web Usage Statistics as Predictors of Later Citation Impact
Author eprint, 18 May 2005, University of Southampton, School of Electronics and Computer Science, in Journal of the American Society for Information Science and Technology, Volume 57, Issue 8, 2006, 1060-1072 (abstract)

Wren, J. D. (2005)
Open access and openly accessible: a study of scientific publications shared via the internet
BMJ, 330:1128, 12 April 2005





Wren's article also prompted this editorial:
Suber, P. (2005)
Open access, impact, and demand
BMJ, 330:1097-1098, 14 May 2005

Belew, R. (2005)
Scientific impact quantity and quality: Analysis of two sources of bibliographic data (pdf 12pp)
Arxiv.org, cs.IR/0504036, 11 April 2005

DeGroote, S. L., Shultz, M. and Doranski, M. (2005)
Online journals' impact on the citation patterns of medical faculty
J Med Libr Assoc., 93 (2): 223-228, April 2005
From the conclusion: "It is possible that electronic access to information (i.e., online databases) has had a positive impact on the number of articles faculty will cite. Results of this study suggest, at this point, that faculty are still accessing the print-only collection, at least for research purposes, and are therefore not sacrificing quality for convenience."

Metcalfe, T. S. (2005)
The Rise and Citation Impact of astro-ph in Major Journals
ArXiv, Astrophysics, astro-ph/0503519, 23 March 2005
"I describe a simple method to determine the adoption rate and citation impact of astro-ph over time for any journal using NASA's Astrophysics Data System (ADS). I use the ADS to document the rise in the adoption of astro-ph for three major astronomy journals, and to conduct a broad survey of the citation impact of astro-ph in 13 different journals. I find that the factor of two boost in citations for astro-ph papers is a common feature across most of the major astronomy journals."

Ongoing studies: Hajjem, C. (2004-05)
Cover page for the range of studies highlighted below, Laboratoire de recherche en Sciences Cognitives, UQAM. (Text in French but graphs "self-explanatory"; see this comment for elaboration)

Bollen, J., Van de Sompel, H., Smith, J. and Luce, R. (2005)
Toward alternative metrics of journal impact: A comparison of download and citation data (pdf 34pp)
Arxiv.org, cs.DL/0503007, 03 March 2005, in Information Processing and Management, 41(6): 1419-1440, December 2005

Ongoing study: Brody, T., et al.
Citation Impact of Open Access Articles vs. Articles Available Only Through Subscription ("Toll-Access")
with downloadable graphs of '% Articles OA' and '% OA Advantage' by discipline and sub-discipline

Schwarz, G. and Kennicutt Jr., R. C. (2004)
Demographic and Citation Trends in Astrophysical Journal Papers and Preprints (pdf 14pp)
Arxiv.org, astro-ph/0411275, 10 November 2004, Bulletin of the American Astronomical Society, Vol. 36, 1654-1663
See also a note from the AAS Pub Board meeting, Tucson, November 3-4, 2003:
"Greg Schwarz (from the ApJ editorial office) reported some work he's doing tracking citation rates of papers published in the ApJ based on whether they were posted on astro-ph or not: ApJ papers that were also on astro-ph have a citation rate that is _twice_ that of papers not on the preprint server"
http://listserv.nd.edu/cgi-bin/wa?A2=ind0311&L=pamnet&D=1&O=D&P=1632

Havemann, F. (2004)
Eprints in der wissenschaftlichen Kommunikation (Eprints in scientific communication)
Author eprint, 26 October 2004, presented at the Institute of Library Science, Humboldt University, Berlin, June 1, 2004
"the use of eprints can significantly accelerate the scientific communication. This was demonstrated by me with a small sample of articles in theoretical High Energy Physics published 1998 and 1999 in Physical Review D. Typically the eprints in this sample are available eight months before the printed issue is published. Three quarters of them are cited in eprints authored by other researchers before the journal issue appears (among them all highly cited eprints)."

Brody, T. (2004)
Citation Analysis in the Open Access World
Author eprint, October 4, 2004, in Interactive Media International

McVeigh, M. E. (2004)
Open Access Journals in the ISI Citation Databases: Analysis of Impact Factors and Citation Patterns
Thomson Scientific, October 2004

Antelman, K. (2004)
Do Open-Access Articles Have a Greater Research Impact?
College and Research Libraries, 65(5): 372-382, September 2004
also Author eprint, E-LIS, 29 September 2004, http://eprints.rclis.org/archive/00002309/


Harnad, S., Brody, T., Vallieres, F., Carr, L., Hitchcock, S., Gingras, Y., Oppenheim, C., Stamerjohanns, H. and Hilf, E. (2004)
The Access/Impact Problem and the Green and Gold Roads to Open Access
Author eprint, 15 September 2004, in Serials Review, Vol. 30, No. 4, 310-314 (free access to published version during 2005)
Shorter version: The green and the gold roads to Open Access
Nature, Web Focus: access to the literature, May 17, 2004

Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C. S., Demleitner, M., Murray, S. S. (2004b)
The Effect of Use and Access on Citations
Author eprint, September 2004, in Information Processing and Management, 41 (6): 1395-1402, December 2005




Perneger, T. V. (2004)
Relation between online "hit counts" and subsequent citations: prospective study of research papers in the BMJ
BMJ, 329:546-547, 4 September 2004



Prakasan, E. R. and Kalyane, V. L. (2004)
Citation analysis of LANL High-Energy Physics E-Prints through Science Citation Index (1991-2002)
Author eprint, E-LIS, 26 August 2004

Murali, N. S., Murali, H. R., Auethavekiat, P., Erwin, P. J., Mandrekar, J. N., Manek, N. J. and Ghosh, A. K. (2004)
Impact of FUTON and NAA Bias on Visibility of Research
Mayo Clinic Proceedings, Vol. 79, No. 8, 1001-1006, August 2004
Notes and comment: FUTON = full text on the Net; NAA = no abstract available
This is not an article on how Open Access increases impact but on how *Online* Access increases impact. The effects are related, but one is a licensing effect, not an OA effect.

Davis, P. M. (2004)
For Electronic Journals, Total Downloads Can Predict Number of Users
portal: Libraries and the Academy, Vol. 4, No. 3, July 2004, 379-392

Harnad, S. and Brody, T. (2004a)
Comparing the Impact of Open Access (OA) vs. Non-OA Articles in the Same Journals
D-Lib Magazine, Vol. 10 No. 6, June 2004
Replicates the Lawrence effect -- OA increases impact -- in physics.

Pringle, J. (2004)
Do Open Access Journals have Impact?
Nature, Web Focus: access to the literature, May 7, 2004

Testa, J. and McVeigh, M. E. (2004)
The Impact of Open Access Journals: A Citation Study from Thomson ISI (pdf 17pp)
Author eprint, 14 April 2004

Kurtz, M. J. (2004)
Restrictive access policies cut readership of electronic research journal articles by a factor of two (pdf 2pp)
Harvard-Smithsonian Centre for Astrophysics, Cambridge, MA
Poster presentation at National Policies on Open Access (OA) Provision for University Research Output: an International meeting, Southampton, 19 February 2004

Brody, T., Stamerjohanns, H., Harnad, S., Gingras, Y. and Oppenheim, C. (2004)
The Effect of Open Access on Citation Impact (pdf 1pp)
Poster presentation at National Policies on Open Access (OA) Provision for University Research Output: an International meeting, Southampton, 19 February 2004

Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C. S., Demleitner, M. and Murray, S. S. (2004a)
Worldwide Use and Impact of the NASA Astrophysics Data System Digital Library
Author eprint, January 28, 2004, in Journal of the American Society for Information Science and Technology, Vol. 56, No. 1, 36-45, published online 20 September 2004

Hitchcock, S., Brody, T., Gutteridge, C., Carr, L. and Harnad, S. (2003b)
The Impact of OAI-based Search on Access to Research Journal Papers
Author eprint, 15 September 2003, in Serials, Vol. 16, No. 3, November 2003, 255-260

Hitchcock, S., Woukeu, A., Brody, T., Carr, L., Hall, W. and Harnad, S. (2003a)
Evaluating Citebase, an open access Web-based citation-ranked search and impact discovery service
Technical Report ECSTR-IAM03-005, School of Electronics and Computer Science, University of Southampton, July 2003

Bollen, J., Vemulapalli, S. S., Xu, W. and Luce, R. (2003)
Usage Analysis for the Identification of Research Trends in Digital Libraries
D-Lib Magazine, Vol. 9, No. 5, May 2003

Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C., Demleitner, M., Murray, S. S., Martimbeau, N. and Elwell, B. (2003b)
The NASA Astrophysics Data System: Sociology, Bibliometrics, and Impact
Author eprint, March 2003, Journal of the American Society for Information Science and Technology, submitted for publication

Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C. S., Demleitner, M., Murray, S. S., Martimbeau, N. and Elwell, B. (2003a)
The Bibliometric Properties of Article Readership Information
Author eprint, March 2003, in Journal of the American Society for Information Science and Technology, 56 (2): 111-128, January 15, 2005

Brown, C. (2003)
The role of electronic preprints in chemical communication: Analysis of citation, usage, and acceptance in the journal literature
Journal of the American Society for Information Science, Vol. 54, No. 5, 362-371, published online 6 Feb 2003
This study characterizes the usage and acceptance of electronic preprints (e-prints) in the literature of chemistry. Survey of authors of e-prints appearing in the Chemistry Preprint Server (CPS) at http://preprints.chemweb.com indicates use of the CPS as a convenient vehicle for dissemination of research findings and for receipt of feedback before submitting to a peer-reviewed journal. Reception of CPS e-prints by editors of top chemistry journals is very poor. Only 6% of editors responding allow publication of articles that have previously appeared as e-prints. Consequently, it was not surprising to discover that citation analysis yielded no citations to CPS e-prints in the traditional literature of chemistry.

Drenth, J. P. H. (2003)
More reprint requests, more citations? (subscriber access to full text)
Scientometrics, Vol. 56, No. 2, February 2003, 283-286, revised version published online August 2006
From the abstract: "This study aims to correlate the number of reprint requests from a 10-year-sample of articles with the number of citations. ... Articles that received most reprint requests are cited more often."

Darmoni, S. J.,et al. (2002)
Reading factor: a new bibliometric criterion for managing digital libraries
Journal of the Medical Library Association, Vol. 90, No. 3, July 2002

Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C. S., Thompson, D. M., Bohlen, E. H. and Murray, S. S. (2002)
The NASA Astrophysics Data System: Obsolescence of Reads and Cites (pdf 8pp)
Library and Information Services in Astronomy IV, edited by B. Corbin, E. Bryson, and M. Wolf, July 2002

Bollen, J. and Luce, R. (2002)
Evaluation of Digital Library Impact and User Communities by Analysis of Usage Patterns
D-Lib Magazine, Vol. 8, No. 6, June 2002

Curti, M., Pistotti, V., Gabutti, G. and Klersy, C. (2001)
Impact factor and electronic versions of biomedical scientific journals
Haematologica, Vol. 86, No. 10, Oct 2001, 1015-1020
From the Abstract: The availability of journals (table of contents (TOC), abstracts, full text and free full text) on Internet, in years 1995-2000, was assessed between December 2000 and January 2001. The first 20 top-journals from 8 subject categories were included. Changes in impact factor over time and association with Internet availability were modeled. RESULTS: Overall, 118/139 journals (85%) had their TOC on the Internet, of these 107 (77%) had abstracts, 97 (70%) had full text and 33 (24%) free full text. The median impact factor for all journals was 1.65, 2.08, 2.10, 2.21 and 2.35 for the years from 1995 to 1999, respectively. This increase was statistically significant, with differences among subject categories. The presence of TOC, abstracts and full text on the Internet was also significantly associated with higher impact factor, after accounting for time and subject category. INTERPRETATION AND CONCLUSIONS: The impact factor has been used for assessing the quality of journals. We identified a new limitation of this indicator: the impact factor seems to be related to the amount of circulation of information through Internet. This could be a temporary limitation, associated with diffusion of journals on, and spread of Internet.

Lawrence, S. (2001)
Free online availability substantially increases a paper's impact
Nature, 31 May 2001
see also Online or invisible, an extended version of the Nature article self-archived by the author
This paper reported the first major findings on the impact effect documented in this bibliography, and it remains the most cited paper in the bibliography. Note, its results concern online access rather than open access. At that time the focus was on the transition from print to electronic publication, with Lawrence quantifying the improved access that resulted based on a sample of c.120k computer science papers from 1,494 'venues'.

Anderson, K., Sack, J., Krauss, L. and O'Keefe, L. (2001)
Publishing Online-Only Peer-Reviewed Biomedical Literature: Three Years of Citation, Author Perception, and Usage Experience
Journal of Electronic Publishing, Vol. 6, No. 3, March 2001
One of the first studies of the citation effect of online against offline publication, rather than of open access against non-OA. Provides data for one journal and a small number of articles over a three-year period. This paper was added to the bibliography following this correspondence:

Brown, C. (2000)
The E-volution of preprints in the scholarly communication of physicists and astronomers
Journal of the American Society for Information Science, Vol. 52, No. 3, 187-200, published online 30 Nov 2000
From the Abstract: To learn how e-prints are cited, used, and accepted in the literature of physics and astronomy, the philosophies, policies, and practices of top-tier physics and astronomy journals regarding e-prints from the Los Alamos e-print archive, arXiv.org, were examined. Citation analysis illustrated e-prints were cited with increasing frequency by a variety of journals in a wide range of physics and astronomy fields from 1998 to 1999. Even though the policies concerning e-print citation and publication were inconsistent, the number of citations (35,928) and citation rates (34.1%) to 12 arXiv.org archives were found to be large and increasing.

Odlyzko, A. M. (2000)
The rapid evolution of scholarly communication
PEAK 2000: Economics and Usage of Digital Library Collections conference, Ann Arbor, MI, March 2000.
Also in Learned Publishing, 15(1), 7-19, January 2002. Author eprint: http://www.dtc.umn.edu/~odlyzko/doc/rapid.evolution.pdf
Notes the growing usage of information in electronic form (c.f. print forms) and of journal papers from non-journal sites (e.g. eprints), and presents evidence that usage increases when access is more convenient

Youngen, G. K. (1998)
Citation Patterns to Electronic Preprints in the Astronomy and Astrophysics Literature
Library and Information Services in Astronomy III, ASP Conference Series, Vol. 153, 1998
see also
Citation Patterns to Traditional and Electronic Preprints in the Published Literature
College & Research Libraries, September 1998

Youngen, G. (1998)
Citation Patterns Of The Physics Preprint Literature With Special Emphasis On The Preprints Available Electronically
Author eprint, UIUC Physics and Astronomy library, c. 5 November 1998, presented at ACRL/STS on 6/29/97

Web tools for measuring impact

Citebase Search "Search and citation analysis tool for the free, online research literature" http://citebase.eprints.org/
see
Jacsó, P. (2004) CiteBase Search, Online, Sep/Oct 2004, 57-58
Brody, T. (2003) Citebase Search: Autonomous Citation Database for e-print Archives, sinn03 conference on Worldwide Coherent Workforce, Satisfied Users - New Services For Scientific Information, Oldenburg, Germany, September 2003
Hitchcock, S., et al. (2003a) Evaluating Citebase, an open access Web-based citation-ranked search and impact discovery service
Correlation Generator http://citebase.eprints.org/analysis/correlation.php
Generates a graph (or table) of the correlation between citation impact and usage impact from the Citebase database
see Brody, T. and Harnad, S. 2005 (in prep.)
Citeseer "Scientific literature digital library" http://citeseer.ist.psu.edu/
Now available as CiteSeerx, or "Next Generation CiteSeer" http://citeseerx.ist.psu.edu
see
Jacsó, P. (2005) CiteSeer, Thomson Gale, November 2005
Lawrence, S., Giles, C. L., Bollacker, K. (1999), Digital Libraries and Autonomous Citation Indexing, IEEE Computer, Vol. 32, No. 6, 67-71, 1999

Elsevier Scopus Bibliographic database covering 13,450 peer-reviewed titles http://www.scopus.com/
see
Jacsó, P. (2007) Scopus (2008 Winter Release), Gale, Reference Reviews, Péter's Digital Reference Shelf, November 2007
Burnham, J. F. (2006) Scopus database: a review, Biomedical Digital Libraries, 3:1, 8 March 2006
Dess, H. M. (2006) Scopus, Issues in Science and Technology Librarianship, Winter 2006
Quint, B. (2006) Elsevier's Scopus Introduces Citation Tracker: Challenge to Thomson ISI's Web of Science?, Newsbreaks, January 23, 2006
Goodman, D. and Deis, L. (2006) Update on Scopus, The Charleston Advisor, Vol. 7, No. 3, January 2006, 42-43
Jacsó, P. (2004) Scopus, Thomson Gale, September 2004
see also Comparative reviews

Google Scholar Find articles from academic publishers, preprint repositories and universities, as well as scholarly articles across the web (presents citations as separate results) http://scholar.google.com/
see
Harzing, A. W. K., van der Wal, R. (2008) Google Scholar as a new source for citation analysis, Ethics in Science and Environmental Politics, Vol. 8, No. 1, June 03, 2008, 61-73
Jacsó, P. (2008) The pros and cons of computing the h-index using Google Scholar, Online Information Review, 32(3) 2008, 437-452
Meier, J. J. and Conkling, T. W. (2008) Google Scholar's Coverage of the Engineering Literature: An Empirical Study, Journal of Academic Librarianship, Vol. 34, No. 3, May 2008, 196-201 (full text requires subscription; abstract only)
Jacsó, P. (2008) Google Scholar, Online, Mar/Apr 2008, 53-54
Harzing, A.-W. (2007) Reflections on Google Scholar, Harzing.com, fifth version, 6 September 2007
about the citation analysis software Publish or Perish and its relation with Google Scholar
Quint, B. (2007) Changes at Google Scholar: A Conversation With Anurag Acharya, NewsBreaks, August 27, 2007
rare public interview with the low-profile 'designer and missionary' behind Google Scholar
Mayr, P. and Walter, A.-K. (2007) An exploratory study of Google Scholar, arXiv.org > cs > arXiv:0707.3575v1 [cs.DL], July 24, 2007, in Online Information Review, Vol. 31, No. 6 (2007), 814-830. Author preprint also available from http://www.ib.hu-berlin.de/~mayr/arbeiten/OIR-Mayr-Walter-2007.pdf
Robinson, M. L. and Wusteman, J. (2007) Putting Google Scholar to the test: a preliminary study, author eprint, also in Program: Electronic Library and Information Systems, Vol. 41, Issue 1, February 2007, 71-80
Sadeh, T. (2006) Google Scholar Versus Metasearch Systems, HEP Libraries Webzine, issue 12, March 2006
"thoughtful and informative ... altogether the best overview of Google Scholar, other large federated search systems such as Scirus, and library-based metasearch tools I've seen." Reviewed by Tennant, R., Current Cites, January 2006 issue
Burright, M. (2006) Google Scholar -- Science & Technology, Issues in Science and Technology Librarianship, Winter 2006
Noruzi, A. (2005) Google Scholar: the new generation of citation indexes (pdf 11pp), E-LIS, 11 February 2006, in LIBRI 55(4): 170-180
Jacsó, P. (2005) Google Scholar and The Scientist, commenting on his interview in Perkel, J., The Future of Citation Analysis (abstract only), The Scientist, October 24, 2005
Jacsó, P. (2005) Google Scholar (Redux), Thomson Gale, June 2005
Myhill, M. (2005) Google Scholar, Charleston Advisor, Vol. 6, No. 4, April 2005
Giustini, D. and Barsky, E. (2005) A look at Google Scholar, PubMed, and Scirus: comparisons and recommendations, Journal of the Canadian Health Libraries Association/Journal de l'Association des bibliothèques de la santé du Canada (JCHLA / JABSC) 26: 85-89 (2005) (pdf 5pp)
Jacsó, P. (2004) Google Scholar Beta, Thomson Gale, December 2004
see also Comparative reviews

ISI Web of Science Cited reference searching of 8,700 high impact research journals http://www.isinet.com/products/citation/wos/
see
Jacsó, P. (2007) Web of Science, Gale, Reference Reviews, Péter's Digital Reference Shelf, January 2007
Jacsó, P. (2004) Web of Science Citation Indexes, Thomson Gale, August 2004
see also Comparative reviews

Rexa.Info Covers the computer science research literature. Rexa is "a sibling to CiteSeer, Google Scholar, Academic.live.com, the ACM Portal. Its chief enhancement is that Rexa knows about more first-class, de-duplicated, cross-referenced object types: not only papers and their citation links, but also people, grants, topics" http://rexa.info/

Windows Live Search Academic Beta version. Indexes content related to computer science, physics, electrical engineering, and related subject areas, with more than 6 million records from approximately 4,300 journals, 2,000 conferences and ArXiv.org. In collaboration with Citeseer http://academic.live.com/
see
Nadella, S. (2008) Book search winding down, Live Search, The official blog of the Live Search team at Microsoft, May 23, 2008
"Live Search Books and Live Search Academic projects ... will be taken down next week. Books and scholarly publications will continue to be integrated into our Search results, but not through separate indexes."
Jacsó, P. (2008) Live Search Academic, Gale, Reference Reviews, Péter's Digital Reference Shelf, April 2008
Jacsó, P. (2006) Windows Live Academic, Online, Sep/Oct 2006, 59-60
Quint, B. (2006) Windows Live Academic Search: The Details, Newsbreaks, April 17, 2006
Sherman, C. (2006) Microsoft Launches Windows Live Academic Search, SearchEngineWatch.com, April 12, 2006

Citations in Economics Not intended for direct user access; instead made available to RePEc services such as Socionet, EconPapers and IDEAS. Uses Citeseer software http://citec.repec.org/
Ranks working paper series and journals in Economics http://citec.repec.org/s/
see
Barrueco Cruz, J. M. and Krichel, T. (2004) Building an autonomous citation index for grey literature: the economics working papers case (pdf 12pp), E-LIS, 01 February 2005, also in Proceedings GL6: Sixth International Conference on Grey Literature, New York, December 2004

CrossRef Forward linking service tool allows CrossRef member publishers to display cited-by links in their primary content
CrossRef and Atypon announce forward linking service (press release) June 8, 2004
Institute of Physics becomes first journals publisher to implement 'cited-by' links using CrossRef's Forward Linking service: Time travel with IOP journals (IOP press release) 14 March 2005

Forthcoming: ISI Web Citation Index
see
Martello, A. (2006) Selection of Content for the Web Citation Index: Institutional Repositories and Subject-Specific Archives, Thomson.com, undated
Pringle, J. (2005) Partnering helps institutional repositories thrive, KnowledgeLink Newsletter, February 2005
Citeseer's replacement? List server mailing, 18 March 2004
Quint, B. (2004) Thomson ISI to Track Web-Based Scholarship with NEC's CiteSeer, Information Today Newsbreaks, March 1, 2004

Norris, M., Oppenheim, C. and Rowland, F. (2008)
Finding open access articles using Google, Google Scholar, OAIster and OpenDOAR
Online Information Review, Vol. 32, No. 6, 2008, 709-715
also available from Loughborough University Institutional Repository, 2009-01-12 http://hdl.handle.net/2134/4084
From the abstract: "Google, Google Scholar, OAIster and OpenDOAR were used to try to locate OA versions of peer reviewed journal articles drawn from three subjects (ecology, economics, and sociology). The paper shows the relative effectiveness of the search tools in these three subjects. The results indicate that those wanting to find OA articles in these subjects, for the moment at least, should use the general search engines Google and Google Scholar first rather than OpenDOAR or OAIster."

Jacsó, P. (2008)
The Plausibility of Computing the H-index of Scholarly Productivity and Impact Using Reference Enhanced Databases
Online Information Review, 32(2) 2008, 266-283
"aims to provide a general overview of the three largest, cited-reference-enhanced, multidisciplinary databases (Google Scholar, Scopus, and Web of Science) for determining the h-index. The practical aspects of determining the h-index also need scrutiny, because some content and software characteristics of reference-enhanced databases can strongly influence the h-index values."
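Several of the entries above turn on how the h-index is computed from a database's citation counts: a researcher has index h if h of their papers each have at least h citations. A minimal sketch of that definition (the function name and the sample counts are illustrative, not drawn from any of the databases reviewed):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank          # this paper still supports an index of `rank`
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))   # 4: four papers with >= 4 citations
print(h_index([25, 8, 5, 3, 3]))   # 3: extra citations on one paper add nothing
```

Because the index depends only on which citations a database actually records, the coverage differences between Google Scholar, Scopus and Web of Science discussed in these reviews translate directly into different h-values for the same author.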

Meho, L. I. and Rogers, Y. (2008) Citation Counting, Citation Ranking, and h-Index of Human-Computer Interaction Researchers: A Comparison between Scopus and Web of Science, E-LIS, 10 March 2008, in Journal of the American Society for Information Science and Technology, 59(11): 1711-1726, September 2008

Kloda, L. A. (2007) Use Google Scholar, Scopus and Web of Science for Comprehensive Citation Tracking, Evidence Based Library and Information Practice, 2(3): 87-90, 2007, also in E-LIS, 21 September 2007 http://eprints.rclis.org/archive/00011437/

Schroeder, R. (2007) Pointing Users Toward Citation Searching: Using Google Scholar and Web of Science, portal: Libraries and the Academy, Vol. 7, No. 2, April 2007, 243-248 (full text requires subscription)

Goodman, D. and Deis, L. (2007) Update on Scopus and Web of Science, The Charleston Advisor, Vol. 8, No. 3, January 2007, 15-18

Meho, L. I. and Yang, K. (2006) A New Era in Citation and Bibliometric Analyses: Web of Science, Scopus, and Google Scholar, arXiv.org, Computer Science, cs/0612132, 23 Dec 2006, published as Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus Scopus and Google Scholar, in Journal of the American Society for Information Science and Technology, Vol. 58, No. 13, 2007, 2105-2125

Fingerman, S. (2006) Web of Science and Scopus: Current Features and Capabilities, Issues in Science and Technology Librarianship, Fall 2006

Neuhaus, C. and Daniel, H.-D. (2006) Data sources for performing citation analysis: An overview, ETH E-Collection, June 30, 2006, Journal of Documentation, accepted for publication
Reports the limitations of Thomson Scientific's citation indexes and reviews the characteristics of the citation-enhanced databases Chemical Abstracts, Google Scholar and Scopus.

Bakkalbasi, N., Bauer, K., Glover, J. and Wang, L. (2006) Three options for citation tracking: Google Scholar, Scopus and Web of Science, Biomedical Digital Libraries, June 29, 2006

Bosman, J., van Mourik, I., Rasch, M., Sieverts, E. and Verhoeff, H. (2006) Scopus reviewed and compared, Igitur repository, Utrecht University, June 2006
The coverage and functionality of the citation database Scopus, including comparisons with Web of Science and Google Scholar

Wenzel, E. (2006) Google Scholar beta, ZDNet, May 2, 2006
Brief comparison of Google Scholar and Microsoft Live Academic Search

Bailey, C. W. Jr (2006) A Simple Search Hit Comparison for Google Scholar, OAIster, and Windows Live Academic Search, Digital Koans, author blog, April 13, 2006
A simple but revealing experiment: "It should be clear that a sample of one search term is a very crude measure".

Pauly, D. and Stergiou, K. I. (2005) Equivalence of results from two citation analyses: Thomson ISI's Citation Index and Google's Scholar service (pdf 3pp), Ethics in Science and Environmental Politics, 22 December 2005, 33-35

Jacsó, P. (2005) Comparison and analysis of the citedness scores in Web of Science and Google Scholar (pdf 10pp), Digital Libraries: Implementing Strategies and Sharing Experiences, Lecture Notes in Computer Science, 3815: 360-369, 2005, Proceedings of the 8th International Conference on Asian Digital Libraries, ICADL 2005, Bangkok, Thailand, December 12-15, 2005

Roth, D. L. (2005) The emergence of competitors to the Science Citation Index and the Web of Science (pdf 6pp), Current Science Online, Vol. 89, No. 9, 10 November 2005

Jacsó, P. (2005) As we may search — Comparison of major features of the Web of Science, Scopus, and Google Scholar citation-based and citation-enhanced databases (pdf 11pp), Current Science Online, Vol. 89, No. 9, 10 November 2005

Bauer, K. and Bakkalbasi, N. (2005) An Examination of Citation Counts in a New Scholarly Communication Environment, D-Lib Magazine, 11(9), September 2005.
Compares citation counts provided by Web of Science, Scopus, and Google Scholar.



LaGuardia, C. (2005) Scopus vs. Web of Science, Library Journal, 130(1): 40, 42, January 15, 2005

Deis, L. F. and Goodman, D. (2005) Web of Science (2004 version) and Scopus, Charleston Advisor, Vol. 6, No. 3, January 2005

Background

The financial imperative: correlating research access, impact and assessment

There is another dimension to the open access advantage. If open access increases impact, then it will also increase research income and funding. It has been shown in the UK that there is a correlation between research assessment ratings and citation counts, and higher ratings mean more money for the higher rated research groups. Of course, if all papers were made open access by their authors, the relative effect would disappear. First-mover advantage, anyone?

"Research impact translates into money: employment, salary, tenure money, as well as research-funding money: (1) RAE rank correlates with substantial top-sliced funding, (2) it also correlates highly (0.91) with citation counts, and (3) self-archiving increases citation counts by 50-250+%. Do you really think that any researcher who is *aware* of those three correlations is being rational if he doesn't self-archive?" Stevan Harnad

Kronman, U. (2013)
Managing your assets in the publication economy
Confero: Essays on Education, Philosophy and Politics, 1-35, 17 Jan 2013
info:doi/10.3384/confero13v1130117
Abstract: The issue this article aims to address is the fact that publications may nowadays be used to assess impact and quality of research in ways academics may not be fully aware of. During recent years, scholarly publications have gained in importance, not primarily as the traditional vehicle for the dissemination of new scientific findings, but as a foundation for assessing the production and impact of organizations, research groups and individual researchers. This means that publications as artefacts per se are starting to play a new important role in the scientific community and that researchers need to be aware of how publication and citation counts are being used to assess their research and the outreach, impact and reputation of their mother organization. University rankings, for instance, often have some parameters based on the publishing of the ranked institution. This article is thus not about scientific writing as such; it focuses on what happens to your publication after the publishing has taken place and on aspects to take into account while planning the publishing of your article, report or book.

Spörrle, M. and Tumasjan, A. (2011)
Using Search Engine Count Estimates as Indicators of Academic Impact: A Web-based Replication of Haggbloom et al.'s (2002) Study
The Open Psychology Journal, 4, 12-18, 13 July 2011
DOI: 10.2174/1874350101104010012
Abstract: Using a complex set of quantitative and qualitative indicators of scientific importance, Haggbloom et al. compiled a ranking of the most eminent psychologists of the 20th century. The present study set out to replicate this rank-ordered list using simple search engine count estimates (SECEs) obtained from three popular internet search engines. In line with our expectations, our results revealed a small but significant relationship between SECEs and the existing offline ranking when the query specified the scientist's field of research (i.e., psychology). Our results imply that SECEs may be considered easy-to-apply indicators of a researcher's impact.

Li, R. (2011)
Correlation of Impact Measures of Institutional Repositories and PBRF Ranking
ResearchArchive @ Victoria, 18 May 2011
Master's thesis. From the Abstract: This study examines the correlation between the website impact factor of the institutional repositories (IR) of all eight universities in New Zealand and the Performance Based Research Fund (PBRF) quality score. The research also studied different web ranking tools and tried to find out whether these tools can be used to measure the quality of IR documents. The research used Yahoo Site Explorer to collect information on inlinks and also used other tools to collect webpage rankings. The findings of this research are that there is a small correlation between the IR website impact factor and the PBRF quality score, and that page ranking is not a good tool to examine the quality of IR documents as a whole.

Suber, P. (2008)
Thinking about prestige, quality, and open access
SPARC Open Access Newsletter, issue #125, September 2, 2008
Brief extracts. Here are a dozen thoughts or theses about prestige and OA. I start with the rough notion that if journal quality is real excellence, then journal prestige is reputed excellence. (1) Universities reward faculty who publish in high-prestige journals, and faculty are strongly motivated to do so. If universities wanted to create this incentive, they have succeeded. If journal prestige and journal quality can diverge, then universities and funders may be giving authors an incentive to aim only for prestige. If they wanted to create an incentive to put quality ahead of prestige, they haven't yet succeeded. (8) Universities tend to use journal prestige and impact as surrogates for quality. The excuses for doing so are getting thin.

Willinsky, J. (2010)
Open access and academic reputation
NISCAIR Online Periodicals Repository (NOPR), Annals of Library and Information Studies, 57(3), Sep 2010, 296-302
Abstract: Open access aims to make knowledge freely available to those who would make use of it. High-profile open access journals, such as those published by PLoS (Public Library of Science), have been able to demonstrate the viability of this model for increasing an author's reach and reputation within scholarly communication through the use of such bibliographic tools as the Journal Impact Factor, conceived and developed by Eugene Garfield. This article considers the various approaches that authors, journals, and funding agencies are taking toward open access, as well as its effect on reputation for authors and, more widely, for journals and the research enterprise itself.

Li, J., Sanderson, M., Willett, P., Norris, M. andOppenheim, C. (2010)
Ranking of Library and Information Science Researchers: Comparison of Data Sources for Correlating Citation Data and Expert Judgments
Journal of Informetrics, 16 Jun 2010
Open access provides scope for new citation-based metrics, but these would have to be tested and validated against current, preferred methods of assessment. This paper is not focussed on open access, but it shows how such testing and validation might be performed.
From the Abstract: This paper studies the correlations between peer review and citation indicators when evaluating research quality in library and information science (LIS). Forty-two LIS experts provided judgments on a five-point scale of the quality of research published by 101 scholars; the median rankings resulting from these judgments were then correlated with h-, g- and H-index values computed using three different sources of citation data: Web of Science (WoS), Scopus and Google Scholar (GS).

Aguillo, I. F., Ortega, J. L., Fernández, M. and Utrilla, A. M. (2010)
Indicators for a webometric ranking of open access repositories
Scientometrics, Vol. 82, No. 3, March 2010, 477-486, published online: 6 February 2010

Harnad, S., Carr, L., Swan, A., Sale, A. and Bosc, H. (2009)
Maximizing and Measuring Research Impact Through University and Research-Funder Open-Access Self-Archiving Mandates
ECS EPrints, 08 Dec 2009, in Wissenschaftsmanagement, 15(4), 36-41

Aguillo, I. (2009)
Measuring the institution's footprint in the web
Library Hi Tech, Vol. 27, No. 4, 2009, 540-556
DOI: 10.1108/073788309

Allen, L., Jones, C., Dolby, K., Lynn, D. and Walport, M. (2009)
Looking for Landmarks: The Role of Expert Review and Bibliometric Analysis in Evaluating Scientific Publication Outputs
PLoS ONE, 4(6): e5910, June 18, 2009, doi:10.1371/journal.pone.0005910

Corbyn, Z. (2009)
Hefce backs off citations in favour of peer review in REF
Times Higher Education, 18 June 2009

Houghton, J., Rasmussen, B., Sheehan, P., Oppenheim, C., Morris, A., Creaser, C., Greenwood, H., Summers, M. and Gourlay, A. (2009)
Economic implications of alternative scholarly publishing models: Exploring the costs and benefits
JISC, 27 January 2009

Oppenheim, C. (2008)
Out with the old and in with the new: The RAE, bibliometrics and the new REF (first page pdf; full text requires subscription)
Journal of Librarianship and Information Science, 40(3): 147-149, September 2008

Cho, S.-R. (2008)
New evaluation indexes for articles and authors' academic achievements based on Open Access Resources (full text requires subscription; abstract only)
Scientometrics, Vol. 77, No. 1 (2008), 91-112, published online: 24 July 2008

Oppenheim, C. and Summers, M. A. C. (2008)
Citation counts and the Research Assessment Exercise, part VI: Unit of assessment 67 (music)
Information Research, 13 (2), paper 342, June 2008

Adler, R., Ewing, J. (Chair) and Taylor, P. (2008)
Citation Statistics (pdf 26pp)
Joint Committee on Quantitative Assessment of Research, International Mathematical Union, IMU-ICIAM-IMS, 6/11/2008


Harnad, S. (2008)
Validating research performance metrics against peer rankings
Ethics in Science and Environmental Politics, Vol. 8, No. 1, June 03, 2008, 103-107

Taraborelli, D. (2008)
Soft peer review. Social software and distributed scientific evaluation (pdf 12pp)
In Proceedings of the 8th International Conference on the Design of Cooperative Systems (COOP 08), Carry-le-Rouet, France, May 20-23, 2008
From the abstract: "I analyze the contribution that social bookmarking systems can provide to the problem of usage-based metrics for scientific evaluation. I suggest that collaboratively aggregated metadata may help fill the gap between traditional citation-based criteria and raw usage factors."

Pringle, J. (2008)
Trends in the use of ISI citation databases for evaluation
Learned Publishing, Vol. 21, No. 2, April 2008, 85-91
Abstract: "This paper explores the factors shaping the current uses of the ISI citation databases in evaluation both of journals and of individual scholars and their institutions. Given the intense focus on outcomes evaluation, in a context of increasing 'democratization' of metrics in today's digital world, it is easy to lose focus on the appropriate ways to use these resources, and misuse can result."

Armbruster, C. (2008)
Access, Usage and Citation Metrics: What Function for Digital Libraries and Repositories in Research Evaluation?
Social Science Research Network, February 05, 2008
From the abstract: "This systematic appraisal of the future role of digital libraries and repositories for metric research evaluation proceeds by investigating the practical inadequacies of current metric evaluation before defining the scope for libraries and repositories as new players. Services reviewed include: Leiden Ranking, Webometrics Ranking of World Universities, COUNTER, MESUR, Harzing POP, CiteSeer, Citebase, RePEc LogEc and CitEc, Scopus, Web of Science and Google Scholar."

Surya, M., D'Este, P. and Neely, A. (2008)
Citation Counts: Are They Good Predictors of RAE Scores? A bibliometric analysis of RAE 2001
Cranfield QUEprints, 31.01.2008


Harnad, S. (2007)
Open Access Scientometrics and the UK Research Assessment Exercise
ArXiv, Computer Science, cs.IR/0703131, 26 March 2007. Preprint of invited keynote address to the 11th Annual Meeting of the International Society for Scientometrics and Informetrics, Madrid, 25-27 June 2007
also in ECS EPrints, 29 March 2007 http://eprints.ecs.soton.ac.uk/13804/
Latest version ECS EPrints, 27 Feb 2009 http://eprints.ecs.soton.ac.uk/17142/, in Scientometrics, 79(1), 2009, 147-156, published online: 13 November 2008, http://dx.doi.org/10.1007/s11192-009-0409-z

Steele, C., Butler, L. and Kingsley, D. (2006)
The publishing imperative: the pervasive influence of publication metrics
ANU Institutional Repository, 30 October 2006, also in Learned Publishing, 19(4): 277-290, October 2006

Houghton J. and Sheehan, P. (2006)
The Economic Impact of Enhanced Access to Research Findings
Centre for Strategic Economic Studies, Victoria University, July 2006
See also
Houghton, J., Steele, C. and Sheehan, P. (2006)
Research Communication Costs In Australia: Emerging Opportunities And Benefits
Department of Education, Science and Training (DEST), Australia, September 2006


Shadbolt, N., Brody, T., Carr, L. and Harnad, S. (2006)
The Open Research Web: A Preview of the Optimal and the Inevitable
ECS EPrints, 02 May 2006, in Open Access: Key Strategic, Technical and Economic Aspects, Jacobs, N., Ed., chapter 21 (Oxford: Chandos Publishing)

Harnad, S. (2005)
Maximising the Return on UK's Public Investment in Research
Author eprint, September 14, 2005
Attempts to monetise 'lost' impact: "The online-age practice of self-archiving has been shown to increase citation impact by a dramatic 50-250%, but so far only 15% of researchers are doing it spontaneously. Citation impact is rewarded by universities (through promotions and salary increases) and by research-funders like RCUK (through grant funding and renewal) at a conservative estimate of £46 per citation. ... As a proportion of the RCUK's yearly £3.5bn research expenditure (yielding 130,000 articles x 5.6 = 761,600 citations), our conservative estimate would be 50% x 85% x £3.5bn = £1.5bn worth of loss in potential research impact (323,680 potential citations lost)."
See also
Australia is not maximising the return on its research investment (ETD2005, Sydney) for the same estimate applied to the potential lost return ($425M) there.



Day, M. (2004)
Institutional repositories and research assessment (pdf 29pp)
Author eprint (v. 0.1), 2 December 2004

Harnad, S. (2003)
Maximizing university research impact through self-archiving
Jekyll.com, No. 7, December 2003

Harnad, S. (2003)
Enhance UK research impact and assessment by making the RAE webmetric
Author eprint, in Times Higher Education Supplement, 6 June 2003, p. 16

Harnad, S., Carr, L., Brody, T. and Oppenheim, C. (2003)
Mandated online RAE CVs linked to university eprint archives: Enhancing UK research impact and assessment
Ariadne, issue 35, April 2003

Smith, A. and Eysenck, M. (2002)
The correlation between RAE ratings and citation counts in psychology (pdf 12pp)
Technical Report, Psychology, Royal Holloway College, University of London, June 2002

Holmes, A. and Oppenheim, C. (2001)
Use of citation analysis to predict the outcome of the 2001 Research Assessment Exercise for Unit of Assessment (UoA) 61: Library and Information Management
Information Research, Vol. 6, No. 2, January 2001

Harnad, S. (2001)
Research Access, Impact and Assessment (longer version)
Author eprint, in Times Higher Education Supplement, 1487, p. 16, 2001

Garfield, E. (1988)
Can Researchers Bank on Citation Analysis? (pdf 10pp)
Current Comments, No. 44, October 31, 1988
attached (pp 3-10)
Diamond, Jr., A. M. (1986)
What is a Citation Worth?
J. Hum. Resour., 21:200-15, 1986
Garfield comments on studies that attempt to quantify the reward system of science in terms of monetary returns to author salaries from article publication and citations, reprinting one of those studies

Citation analysis, indexes and impact factors

Notes. Important work in this area builds on Eugene Garfield's pioneering work in the 1950s. Although it has produced some remarkably successful tools for measuring the impact of the scholarly literature, this area is not without controversy. This short list presents a cross-section underlining these issues, with a view to understanding how such long-established approaches might adapt to online data, and how possible shortcomings might be overcome.

Thelwall, M., Haustein, S., Larivière, V. and Sugimoto, C. R. (2013)
Do altmetrics work? Twitter and ten other social web services
Author preprint. PLoS ONE, 8(5): e64841, May 28, 2013 http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0064841
doi:10.1371/journal.pone.0064841
Abstract: Altmetric measurements derived from the social web are increasingly advocated and used as early indicators of article impact and usefulness. Nevertheless, there is a lack of systematic scientific evidence that altmetrics are valid proxies of either impact or utility, although a few case studies have reported medium correlations between specific altmetrics and citation rates for individual journals or fields. To fill this gap, this study compares 11 altmetrics with Web of Science citations for 76 to 208,739 PubMed articles with at least one altmetric mention in each case and up to 1,891 journals per metric. It also introduces a simple sign test to overcome biases caused by different citation and usage windows. Statistically significant associations were found between higher metric scores and higher citations for articles with positive altmetric scores in all cases with sufficient evidence (Twitter, Facebook wall posts, research highlights, blogs, mainstream media and forums) except perhaps for Google+ posts. Evidence was insufficient for LinkedIn, Pinterest, question and answer sites, and Reddit, and no conclusions should be drawn about articles with zero altmetric scores or the strength of any correlation between altmetrics and citations. Nevertheless, comparisons between citations and metric values for articles published at different times, even within the same year, can remove or reverse this association, and so publishers and scientometricians should consider the effect of time when using altmetrics to rank articles. Finally, the coverage of all the altmetrics except for Twitter seems to be low, and so it is not clear if they are prevalent enough to be useful in practice.
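A sign test of the general kind the abstract mentions compares paired scores and asks only which member of each pair wins, which makes it robust to the skewed distributions typical of citation and altmetric counts. A generic sketch (not the paper's exact procedure; the function name and sample data are illustrative), using an exact two-sided binomial p-value:

```python
from math import comb

def sign_test(pairs):
    """Two-sided exact sign test on (a, b) score pairs.

    Ties are discarded; under the null hypothesis each remaining pair
    is equally likely to favour either side, so the wins for one side
    follow Binomial(n, 0.5)."""
    wins_a = sum(1 for a, b in pairs if a > b)
    wins_b = sum(1 for a, b in pairs if b > a)
    n = wins_a + wins_b
    k = max(wins_a, wins_b)
    # P(X >= k) for X ~ Binomial(n, 0.5), doubled for a two-sided test
    p = min(1.0, 2 * sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n)
    return wins_a, wins_b, p

# In 9 of 10 hypothetical article pairs, the higher-altmetric article
# is also the more cited one.
print(sign_test([(3, 1)] * 9 + [(1, 3)]))  # (9, 1, 0.021484375)
```

Because only the direction of each comparison matters, pairing articles with similar publication dates sidesteps the different citation and usage windows the abstract warns about.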

Konkiel, S. and Scherer, D. (2013)
New Opportunities for Repositories in the Age of Altmetrics
Bulletin of the Association for Information Science and Technology, Vol. 39, No. 4, April/May 2013
University administrators are increasingly trying to find new ways to measure the impact of the scholarly output of their faculty, students and researchers through quantitative means. By reporting altmetrics (alternative metrics based on online activity) for their content, institutional repositories can add value to existing metrics and prove their relevance and importance in an age of growing cutbacks to library services. This article will discuss the metrics that repositories currently deliver and how altmetrics can supplement existing usage statistics to provide a broader interpretation of research-output impact for the benefit of authors, library-based publishers and repository managers, and university administrators alike.

Mounce, R. (2013)
Open Access and Altmetrics: Distinct but Complementary
Bulletin of the Association for Information Science and Technology, Vol. 39, No. 4, April/May 2013
Extract: Alongside (the) growth and preference for online journals, there has been a notable rise in the growth and popularity of a particular type of online journal - open access (OA) journals, which expressly allow anyone on the Internet to read them for free without paying. Such journals make it even easier for people to discover, access and re-use journal literature. With this change in the consumption pattern of journal content to online, new ideas such as altmetrics have arisen to help us better assess the influence and impact of online journal articles. This article considers the complementary relationship between OA journal publishing and altmetrics, scholarly impact measures derived from online activity, as a means of capturing and measuring some of the influence of online journal articles.

Tananbaum, G. (2013)
Article-Level Metrics: A SPARC Primer
SPARC, April 2013
From the Executive Summary: Article-Level Metrics (ALMs) can be employed in conjunction with existing metrics, which have traditionally focused on the long-term impact of a collection of articles (i.e., a journal) based on the number of citations generated. This primer is designed to give campus leaders and other interested parties an overview of what ALMs are, why they matter, how they complement established utilities, and how they can be used in the tenure and promotion process.

Heneberg, P. (2013)
Effects of Print Publication Lag in Dual Format Journals on Scientometric Indicators
PLoS ONE, 8(4): e59877, April 3, 2013
doi:10.1371/journal.pone.0059877
From the Abstract: Dual-format peer-reviewed journals (publishing both print and online editions of their content) adopted a broadly accepted strategy to shorten the publication lag: to publish the accepted manuscripts online ahead of their print editions, which may follow days, but also years, later. Effects of this widespread habit on the calculation of the immediacy index (average number of times an article is cited in the year it is published) were never analyzed. Methodology/Principal Findings: The Scopus database (which contains nearly up-to-date documents in press, but does not reveal citations by these documents until they are finalized) was searched for the journals with the highest total counts of articles in press, or highest counts of articles in press appearing online in 2010-2011. The number of citations received by the articles in press available online was found to be nearly equal to the citations received within the year when the document was assigned to a journal issue. Thus, online publication of in-press articles severely affects the calculation of the immediacy index of their source titles, and disadvantages online-only and print-only journals when evaluating them according to the immediacy index and probably also according to the impact factor and similar measures. Conclusions/Significance: Caution should be taken when evaluating dual-format journals supporting long publication lag.

Mohammadi, E. and Thelwall, M. (2013)
Assessing non-standard article impact using F1000 labels
Scientometrics, published online 20 March 2013
DOI: 10.1007/s11192-013-0993-9
Abstract: Faculty of 1000 (F1000) is a post-publication peer review web site where experts evaluate and rate biomedical publications. F1000 reviewers also assign labels to each paper from a standard list of article types. This research examines the relationship between article types, citation counts and F1000 article factors (FFa). For this purpose, a random sample of F1000 medical articles from the years 2007 and 2008 was studied. In seven out of the nine cases, there were no significant differences between the article types in terms of citation counts and FFa scores. Nevertheless, citation counts and FFa scores were significantly different for two article types: "New finding" and "Changes clinical practice": FFa scores value the appropriateness of medical research for clinical practice, and "New finding" articles are more highly cited. It seems that highlighting key features of medical articles alongside ratings by Faculty members of F1000 could help to reveal the hidden value of some medical papers.

Ludo Waltman, Rodrigo Costas (2013)
F1000 recommendations as a new data source for research evaluation: A comparison with citations
arXiv.org > cs > arXiv:1303.3875, 15 March 2013
Abstract: F1000 is a post-publication peer review service for biological and medical research. F1000 aims to recommend important publications in the biomedical literature, and from this perspective F1000 could be an interesting tool for research evaluation. By linking the complete database of F1000 recommendations to the Web of Science bibliographic database, we are able to make a comprehensive comparison between F1000 recommendations and citations. We find that about 2% of the publications in the biomedical literature receive at least one F1000 recommendation. Recommended publications on average receive 1.30 recommendations, and over 90% of the recommendations are given within half a year after a publication has appeared. There turns out to be a clear correlation between F1000 recommendations and citations. However, the correlation is relatively weak, at least weaker than the correlation between journal impact and citations. More research is needed to identify the main reasons for differences between recommendations and citations in assessing the impact of publications.

Adriano Tort, Ze Targino, and Olavo Amaral (2012)
Rising Publication Delays Inflate Journal Impact Factors
PLoS ONE, 7(12): e53374, 31 Dec 2012
info:doi/10.1371/journal.pone.0053374
From the Abstract: We analyze 61 neuroscience journals and show that delays between online and print publication of articles increased steadily over the last decade. Importantly, such a practice varies widely among journals, as some of them have no delays, while for others this period is longer than a year. Using a modified impact factor based on online rather than print publication dates, we demonstrate that online-to-print delays can artificially raise a journal's impact factor, and that this inflation is greater for longer publication lags. We also show that correcting the effect of publication delay on impact factors changes journal rankings based on this metric. We thus suggest that indexing of articles in citation databases and calculation of citation metrics should be based on the date of an article's online appearance, rather than on that of its publication in print.

Emilio Lopez-Cozar, Nicolas Robinson-Garcia, and Daniel Torres-Salinas(2012)
Manipulating Google Scholar Citations and Google Scholar Metrics: simple, easy and tempting
arXiv.org > cs > arXiv:1212.0638, 04 Dec 2012
From the Abstract: The launch of Google Scholar Citations and Google Scholar Metrics may provoke a revolution in the research evaluation field as it places within every researcher's reach tools that allow bibliometric measuring. In order to alert the research community over how easily one can manipulate the data and bibliometric indicators offered by Google's products we present an experiment in which we manipulate the Google Citations profiles of a research group through the creation of false documents that cite their documents, and consequently, the journals in which they have published, modifying their H-index.

Xuemei Li and Mike Thelwall (2012)
F1000, Mendeley and Traditional Bibliometric Indicators
In 17th International Conference on Science and Technology Indicators (STI), Montreal, 5-8 September 2012
Abstract: This article compares the Faculty of 1000 (F1000) quality filtering results and Mendeley usage data with traditional bibliometric indicators, using a sample of 1397 Genomics and Genetics articles published in 2008 selected by F1000 Faculty Members (FMs). Both Mendeley user counts and F1000 article factors (FFas) correlate significantly with citation counts and associated Journal Impact Factors. However, the correlations for Mendeley user counts are much larger than those for FFas. It may be that F1000 is good at disclosing the merit of an article from an expert practitioner point of view while Mendeley user counts may be more closely related to traditional citation impact. Articles that attract exceptionally many citations are generally disorder or disease related, while those with extremely high social bookmark user counts are mainly historical or introductory.

Daniel Acuna, Stefano Allesina, and Konrad Kording (2012)
Future impact: Predicting scientific success
Nature, 489(7415), 201-2, 13 September 2012
info:doi/10.1038/489201a
Presents a formula to estimate the future h-index of life scientists. The h-index and similar metrics can capture only past accomplishments, not future achievements. Here we attempt to predict the future h-index of scientists on the basis of features found in most CVs.
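The h-index the authors try to extrapolate is itself straightforward to compute from a citation record; a minimal sketch (names are illustrative, and this is the standard Hirsch definition, not the authors' predictive formula):

```python
def h_index(citation_counts):
    """h-index: the largest h such that the author has h papers
    with at least h citations each (Hirsch's definition)."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break  # counts are sorted, so no later paper can
    return h

# five papers cited 10, 8, 5, 4 and 3 times -> h-index of 4
print(h_index([10, 8, 5, 4, 3]))
```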

Jasleen Kaur, Diep Thi Hoang, Xiaoling Sun, Lino Possamai, Mohsen JafariAsbagh, Snehal Patil, Filippo Menczer (2012)
Scholarometer: A Social Framework for Analyzing Impact across Disciplines
PLoS ONE, 7(9), 12 September 2012
info:doi/10.1371/journal.pone.0043235
From the Abstract: We describe a system called Scholarometer, which provides a service to scholars by computing citation-based impact measures. This creates an incentive for users to provide disciplinary annotations of authors, which in turn can be used to compute disciplinary metrics. We first present the system architecture and several heuristics to deal with noisy bibliographic and annotation data. We report on data sharing and interactive visualization services enabled by Scholarometer. Usage statistics, illustrating the data collected and shared through the framework, suggest that the proposed crowdsourcing approach can be successful. Secondly, we illustrate how the disciplinary bibliometric indicators elicited by Scholarometer allow us to implement for the first time a universal impact measure proposed in the literature. Our evaluation suggests that this metric provides an effective means for comparing scholarly impact across disciplinary boundaries.

Daniel Torres-Salinas, Nicolas Robinson-Garcia, and Emilio Lopez-Cozar(2012)
Towards a Book Publishers Citation Reports. First approach using the Book Citation Index
Hacia un ranking bibliométrico de editoriales científicas de libros. Primera aproximación utilizando el Book Citation Index
arXiv.org > cs > arXiv:1207.7067, 29 Jul 2012, Grupo Evaluación de la Ciencia y la Comunicación Científica (EC3) Working Papers 7. In Revista española de Documentación Científica, Vol 35, No 4 (2012) http://redc.revistas.csic.es/index.php/redc/article/view/766
doi:10.3989/redc.2012.4.1010
Abstract: The absence of books and book chapters in the Web of Science Citation Indexes (SCI, SSCI and A&HCI) has always been considered an important flaw but the Thomson Reuters 'Book Citation Index' database was finally available in October of 2010, indexing 29,618 books and 379,082 book chapters. The Book Citation Index opens a new window of opportunities for analyzing these fields from a bibliometric point of view. The main objective of this article is to analyze different impact indicators referred to the scientific publishers included in the Book Citation Index for the Social Sciences and Humanities fields during 2006-2011. This way we construct what we have called the 'Book Publishers Citation Reports'. For this, we present a total of 19 rankings according to the different disciplines in Humanities & Arts and Social Sciences & Law with six indicators for scientific publishers.

Joseph Bernstein and Chancellor Gray (2012)
Content Factor: A Measure of a Journal's Contribution to Knowledge
PLoS ONE, 7(7), 23 Jul 2012
info:doi/10.1371/journal.pone.0041554
From the Abstract: We propose a metric, Content Factor, and examine its performance among leading medical and orthopaedic surgery journals. To remedy Impact Factor's emphasis on recent citations, Content Factor considers the total number of citations, regardless of the year in which the cited paper was published. To correct for Impact Factor's emphasis on efficiency, no denominator is employed. Content Factor is thus the total number of citations in a given year to all of the papers previously published in the journal. We found that Content Factor and Impact Factor are poorly correlated. We further surveyed 75 experienced orthopaedic authors and measured their perceptions of the importance of various orthopaedic surgery journals. The correlation between the importance score and the Impact Factor was only 0.08; the correlation between the importance score and Content Factor was 0.56.
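The two definitions in this abstract reduce to simple arithmetic; a sketch contrasting them (function names are illustrative, and the Impact Factor shown is the standard two-year version rather than anything taken from this paper):

```python
def content_factor(per_paper_citations):
    """Content Factor for a census year: total citations received
    that year by ALL papers the journal has ever published.
    No denominator is applied, so older papers count equally.

    per_paper_citations: iterable of citation counts received in the
    census year, one entry per previously published paper."""
    return sum(per_paper_citations)

def impact_factor(cites_to_prev_two_years, citable_items_prev_two_years):
    """Standard two-year Impact Factor, for contrast: citations in the
    census year to items from the previous two years, divided by the
    number of citable items published in those two years."""
    return cites_to_prev_two_years / citable_items_prev_two_years

# four papers cited 3, 0, 7 and 12 times this year -> Content Factor 22
print(content_factor([3, 0, 7, 12]))
# 150 recent citations to 100 recent items -> Impact Factor 1.5
print(impact_factor(150, 100))
```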

Paul Wouters and Rodrigo Costas (2012)
Users, narcissism and control - tracking the impact of scholarly publications in the 21st century
SURFfoundation, February 2012
From the Executive summary: This report explores the explosion of tracking tools that have accompanied the surge of web based information instruments. The report therefore advises to start a concerted research programme in the dynamics, properties, and potential use of new web based metrics which relates these new measures to the already established indicators of publication impact. Its goal would be to contribute to the development of more useful tools for the scientific and scholarly community. This programme should monitor at least the following tools: F1000, Microsoft Academic Research, Total-Impact, PlosONE altmetrics, and Google Scholar. The programme should moreover develop the following key research themes: concepts of new web metrics and altmetrics; standardisation of tools and data; and the use and normalisation of the new metrics.

Cynthia Lokker, R. Brian Haynes, Rong Chu, K. Ann McKibbon, Nancy L.Wilczynski, and Stephen D. Walter (2012)
How well are journal and clinical article characteristics associated with the journal impact factor? a retrospective cohort study
Journal of the Medical Library Association, 100 (1), 28-33, January 2012. Via PubMed Central
From the Abstract: A retrospective cohort study determined the ability of clinical article and journal characteristics, including appraisal measures collected at the time of publication, to predict subsequent JIFs. Four of the 10 measures were significant in the regression model: number of authors, number of databases indexing the journal, proportion of articles passing methods criteria, and mean clinical newsworthiness scores. For the clinical literature, measures of scientific quality and clinical newsworthiness available at the time of publication can predict JIFs with 60% accuracy.

Xuemei Li, Mike Thelwall, and Dean Giustini (2011)
Validating online reference managers for scholarly impact measurement
Scientometrics, 21 December 2011
info:doi/10.1007/s11192-011-0580-x
(Subscription access required. Online preview.) Abstract: This paper investigates whether CiteULike and Mendeley are useful for measuring scholarly influence, using a sample of 1,613 papers published in Nature and Science in 2007. Traditional citation counts from the Web of Science (WoS) were used as benchmarks to compare with the number of users who bookmarked the articles in one of the two free online reference manager sites. Statistically significant correlations were found between the user counts and the corresponding WoS citation counts, suggesting that this type of influence is related in some way to traditional citation-based scholarly impact but the number of users of these systems seems to be still too small for them to challenge traditional citation indexes.

William J. Sutherland, David Goulson, Simon G. Potts, Lynn V. Dicks(2011)
Quantifying the Impact and Relevance of Scientific Research
PLoS ONE, 6(11), e27537, November 16, 2011
info:doi/10.1371/journal.pone.0027537
Abstract: Qualitative and quantitative methods are being developed to measure the impacts of research on society, but they suffer from serious drawbacks associated with linking a piece of research to its subsequent impacts. We have developed a method to derive impact scores for individual research publications according to their contribution to answering questions of quantified importance to end users of research. To demonstrate the approach, here we evaluate the impacts of research into means of conserving wild bee populations in the UK. For published papers, there is a weak positive correlation between our impact score and the impact factor of the journal. The process identifies publications that provide high quality evidence relating to issues of strong concern. It can also be used to set future research agendas.

Perry Evans and Michael Krauthammer (2011)
Exploring the Use of Social Media to Measure Journal Article Impact
American Medical Informatics Association (AMIA) Annual Symposium Proceedings Archive, 374-81, 22 October 2011. Via PubMed Central
From the Abstract: Using Wikipedia as a proxy for other social media, we explore the correlation between inclusion of a journal article in Wikipedia, and article impact as measured by citation count. We start by cataloging features of PubMed articles cited in Wikipedia. We find that Wikipedia pages referencing the most journal articles are about disorders and diseases, while the most referenced articles in Wikipedia are about genomics. We note that journal articles in Wikipedia have significantly higher citation counts than an equivalent random article subset. We also observe that articles are included in Wikipedia soon after publication. Our data suggest that social media may represent a largely untapped post-publication review resource for assessing paper impact.

Merceur, F., Le Gall, M. and Salaun, A. (2011)
Bibliometrics: a new feature for institutional repositories
Archimer, Ifremer's institutional repository, May 2011. In 14th Biennial EURASLIC Meeting, Lyon, 17-20 May, 2011
From the Abstract: In addition to its promotion and conservation objectives, Archimer, Ifremer's institutional repository, offers a wide range of bibliometric tools described in this document.

Herb, U. (2010)
Open Access Statistics: Alternative Impact Measures for Open Access documents? An examination how to generate interoperable usage information from distributed Open Access services
E-LIS, 25 Sep 2010. In: L'information scientifique et technique dans l'univers numérique. Mesures et usages. L'association des professionnels de l'information et de la documentation, ADBS, pp. 165-178
From the abstract: This contribution shows that most common methods to assess the impact of scientific publications often discriminate Open Access publications and by that reduce the attractiveness of Open Access for scientists. Assuming that the motivation to use Open Access publishing services (e.g. a journal or a repository) would increase if these services would convey some sort of reputation or impact to the scientists, alternative models of impact are discussed. Prevailing research results indicate that alternative metrics based on usage information of electronic documents are suitable to complement or to relativize citation-based indicators. Furthermore an insight into the project Open Access Statistics (OA-S) is given. OA-S implemented an infrastructure to collect document-related usage information from distributed Open Access Repositories in an aggregator service in order to generate interoperable document access information according to three standards (COUNTER, LogEc and IFABC).

Priem, J. and Hemminger, B. (2010)
Scientometrics 2.0: Toward new metrics of scholarly impact on the social Web
First Monday, 15 (7), Jul 2010

Herb, U., Kranz, E., Leidinger, T. and Mittelsdorf, B. (2010)
How to assess the impact of an electronic document? And what does impact mean anyway?: Reliable usage statistics in heterogeneous repository communities
E-LIS, 10 Jun 2010. In OCLC Systems & Services, Vol. 26, No. 2, 2010, 133-145
From the Abstract: Purpose - Usually the impact of research and researchers is quantified by using citation data: either by journal-centered citation data as in the case of the journal impact factor (JIF) or by author-centered citation data as in the case of the Hirsch- or h-index. This paper aims to discuss a range of impact measures, especially usage-based metrics, and to report the results of two surveys. Originality/value - This paper delineates current discussions about citation-based and usage-based metrics. Based on the results of the surveys, it depicts which functionalities could enhance repositories, what features are required by scientists and information professionals, and whether usage-based services are considered valuable. These results also outline some elements of future repository research.

Repanovici, A. (2010)
Measuring the visibility of the University's scientific production using Google Scholar, "Publish or Perish" software and Scientometrics
76th IFLA General Conference and Assembly, Gothenburg, Sweden, 07 Jun 2010
In Performance Measurement and Metrics, Vol. 12, No. 2, 2011. From the Abstract: The first Romanian institutional repository was implemented at Transilvania University of Brasov. As part of the undertaken research, the visibility and the impact of the university's scientific production was measured using the scientific methods of scientometry, as a fundamental instrument for determining the international value of a university as well as for the statistical evaluation of scientific research results. The results showed that an open access institutional repository would significantly add to the visibility of the university's scientific production. In this article we define the scientific production and productivity and present the main indicators for the measurement of the scientific activity. Google Scholar was used as a scientometric database which can be consulted free of charge on the Internet and which indexes academic papers from institutional repositories, identifying also the referenced citations. The free Publish or Perish software can be used as an analysis instrument for the impact of the research, by analysing the citations through the h-index. We present the methodology and the results of an exploratory study made at the Transilvania University of Brasov regarding the h-index of the academic staff.

Neff, B. and Olden, J. (2010)
Not So Fast: Inflation in Impact Factors Contributes to Apparent Improvements in Journal Quality
BioScience, 60 (6), 455-9, June 2010
From the Abstract: Here we propose that impact factors may be subject to inflation analogous to changes in monetary prices in economics. The possibility of inflation came to light as a result of the observation that papers published today tend to cite more papers than those published a decade ago. We analyzed citation data from 75,312 papers from 70 ecological journals published during 1998-2007. We found that papers published in 2007 cited an average of seven more papers than those published a decade earlier. This increase accounts for about 80% of the observed impact factor inflation rate of 0.23. In examining the 70 journals we found that nearly 50% showed increases in their impact factors, but at rates lower than the background inflation rate. Therefore, although those journals appear to be increasing in quality as measured by the impact factor, they are actually failing to keep pace with inflation.

Repanovici, A. (2010)
Measuring the visibility of the universities' scientific production using scientometric methods
6th WSEAS/IASME International Conference on Educational Technologies (EDUTE '10), 03 May 2010
Abstract: Paper presents scientometry as a science and a fundamental instrument for determining the international value of a university as well as for the statistical evaluation of scientific research results. The impact of the research measurable through scientometric indicators is analyzed. Promoting the scientific production of universities through institutional digital repositories deals with the concept of scientific production of the university and the development of scientific research in information society. These concepts are approached through the prism of marketing methods and techniques. The digital repository is analyzed as a PRODUCT, destined for promoting, archiving and preserving scientific production. Find out more about the author and the paper here.
The record and abstract page for the paper does not currently link to the full text.

Wardle, D. (2010)
Do Faculty of 1000 (F1000) ratings of ecological publications serve as reasonable predictors of their future impact?
Ideas in Ecology and Evolution, 3, 2010, 11-15
Commentary article.
From the Abstract: There is an increasing demand for an effective means of post-publication evaluation of ecological work that avoids pitfalls associated with using the impact factor of the journal in which the work was published. One approach that has been gaining momentum is the Faculty of 1000 (hereafter F1000) evaluation procedure, in which panel members identify what they believe to be the most important recent publications they have read. Here I focused on 1530 publications from 7 major ecological journals that appeared in 2005, and compared the F1000 rating of each publication with the frequency with which it was subsequently cited. ... Possible reasons for the F1000 process failing to identify high impact publications may include uneven coverage by F1000 of different ecological topics, cronyism, and geographical bias favoring North American publications. As long as the F1000 process cannot identify those publications that subsequently have the greatest impact, it cannot be reliably used as a means of post-publication evaluation of the ecological literature.

Bornmann, L. and Daniel, H.-D. (2010)
The citation speed index: A useful bibliometric indicator to add to the h index
Authors' server, undated but notice posted to Sigmetrics listserv 26 March 2010, in Journal of Informetrics, accepted for publication
This topic would appear to be a natural complement to OA citation effects, but the paper does not mention any.

Horwood, L. and Robertson, S. (2010)
Role of bibliometrics in scholarly communication
VALA2010 15th Biennial Conference and Exhibition, Melbourne, 9 Feb 2010

Bar-Ilan, J. (2009)
A Closer Look at the Sources of Informetric Research
CYBERmetrics, 13 (1), 23 Dec 2009
From the introduction: The Web has existed for twenty years only, yet the large majority of the data sources for informetric research are available through the Web. ISI's Web of Science was launched in 1997 ... In November 2004 two additional major citation databases appeared on the Web: Elsevier's Scopus and Google Scholar ... and there are easily accessible and often open-source software tools that enable to collect and analyze large quantities of data even on a personal computer. It has become easy to conduct "desktop or poor-man's bibliometrics". The data for informetric research have never been perfect, but now that informetric analysis can be conducted with much greater ease than before, it is even more important to understand the limitations and problems of data sources and methods and to assess the validity of the results. In the following sections I discuss some limitations of the existing sources.

Gonzalez-Pereira, B., Guerrero-Bote, V., Moya-Anegon, F. (2009)
The SJR indicator: A new indicator of journals' scientific prestige
ArXiv, arXiv:0912.4141v1 [cs.DL], 21 Dec 2009

Ball, K. (2009)
TheIndexing of Scholarly Open Access Business Journals
Electronic Journal of Academic and Special Librarianship,10 (3), Winter 2009
"this study focusses on the business and management field and assesses the extent to which scholarly open access journals in this discipline are currently being indexed by both commercial and non-commercial indexing services. Of the commercial indexing services, Ebsco's Business Source Complete covers by far the largest number of open access journals. For business researchers working in an academic environment, Business Source Complete, with its more sophisticated searching and browsing capabilities and deeper historical coverage, is probably the best one-stop option for retrieving scholarly materials from both the subscription-based and OA literature. However, from a simple quantity perspective, OA business journals are being most extensively indexed by OA indexing services, in particular, Google Scholar and Open J-Gate."

Neylon, C. and Wu, S. (2009)
Article-Level Metrics and the Evolution of Scientific Impact
PLoS Biology, 7 (11), 17 Nov 2009

Moed, H. (2009)
Measuring contextual citation impact of scientific journals
ArXiv, arXiv:0911.2632v1 [cs.DL], 13 Nov 2009. Also in Journal of Informetrics (to appear)
About journal impact, and not directly about open access. From the abstract: "This paper explores a new indicator of journal citation impact, denoted as source normalized impact per paper (SNIP). It measures a journal's contextual citation impact, taking into account characteristics of its properly defined subject field, especially the frequency at which authors cite other papers in their reference lists, the rapidity of maturing of citation impact, and the extent to which a database used for the assessment covers the field's literature. It aims to allow direct comparison of sources in different subject fields."

Armbruster, C. (2009)
Whose Metrics? On Building Citation, Usage and Access Metrics as Information Service for Scholars
SSRN Social Science Research Network, 31 Aug 2009
Services mentioned: Journal impact factor, journal usage factor, GoPubMed, SSRN CiteReader, RePEc LogEc, RePEc CitEc, SPIRES, Harzing POP, Webometrics, ISI Web of Knowledge, Scopus, Google Scholar, Citebase, CiteSeer X, CERIF

Stock, W. (2009)
The Inflation of Impact Factors of Scientific Journals
ChemPhysChem, 10 (13), 17 Aug 2009, 2193-6

Patterson, M. (2009)
PLoS Journals - measuring impact where it matters
PLoS blog, 2009-07-13
On why PLoS will no longer highlight the journal impact factor. Instead it will present a range of metrics focussed on the published paper, including individual citation counts from various sources, blog and bookmark counts, links and searches, as illustrated by Peter Binfield in the PLoS ONE community blog (March 31, 2009): "rather than updating the PLoS Journal sites with the new numbers, we've decided to stop promoting journal impact factors on our sites all together. It's time to move on, and focus efforts on more sophisticated, flexible and meaningful measures."

Canos Cerda, J. H., Campos, M. L. and Nieto, E. M. (2009)
What's Wrong with Citation Counts?
D-Lib Magazine, Vol. 15 No. 3/4, March/April 2009
From the abstract: "We argue that a new approach based on the collection of citation data at the time the papers are created can overcome current limitations, and we propose a new framework in which the research community is the owner of a Global Citation Registry characterized by high quality citation data handled automatically."

Cross, J. (2009)
Impact factors - the basics
The E-Resources Management Handbook (2006 - present), UKSG, this chapter published online: 03 February 2009

Leydesdorff, L. (2008)
How are new citation-based journal indicators adding to the bibliometric toolbox?
Author preprint, undated (announced 31 Oct 2008), in Journal of the American Society for Information Science and Technology, Vol. 60, No. 7, 2009, 1327-1336, published online: 2 Feb 2009, http://dx.doi.org/10.1002/asi.21024
From the abstract: "The launching of Scopus and Google Scholar, and methodological developments in social-network analysis have made many more indicators for evaluating journals available than the traditional impact factor, cited half-life, and immediacy index of the ISI. In this study, these new indicators are compared with one another and with the older ones."

Final Impact: What Factors Really Matter? (VIDEO) (2008)
Scholarly Communication Program, Columbia University, October 30, 2008
Panelists: Marian Hollingsworth, Thomson Reuters; Jevin West,Eigenfactor.org; and Johan Bollen, Los Alamos National Laboratory

Radicchi, F., Fortunato, S. and Castellano, C. (2008)
Universality of citation distributions: towards an objective measure of scientific impact
arXiv.org, arXiv:0806.0974v2 [physics.soc-ph], 5 Jun 2008 (v1), last revised 27 Oct 2008
in Proceedings of the National Academy of Sciences of The United States of America, 105 (45): 17268-17272, Nov. 11 2008

Banks, M. A. and Dellavalle, R. (2008)
Emerging Alternatives to the Impact Factor
E-LIS, 05 September 2008, also in OCLC Systems & Services, 24(3)

Brumback, R. A. (2008)
Worshiping false idols: the impact factor dilemma
J. Child Neurol., Vol. 23, No. 4, April 2008, 365-367
"the opacity in Thomson Scientific's refusal to reveal the details of their calculations only serves to increase suspicion about possible data manipulations. ... Now would seem to be the appropriate time for the academic community to demand valid metrics to assess published scientific material"

Althouse, B. M., West, J. D., Bergstrom, T. C. and Bergstrom, C. T. (2008)
Differences in Impact Factor Across Fields and Over Time
eScholarship Repository, California Digital Library, Department of Economics, University of California Santa Barbara, Departmental Working Papers, paper 2008-4-23, April 23, 2008. In Journal of the American Society for Information Science and Technology, Vol. 60, No. 1, 27-34, published online: 21 Aug 2008

Kosmopoulos, C. et Pumain, D. (2007)
Citation, Citation, Citation: Bibliometrics, the web and the Social Sciences and Humanities
Cybergeo, Science et Toile, article 411, published online 17 December 2007, modified 18 January 2008
From the abstract: "The paper reviews the main (bibliometric) data bases and indicators in use. It demonstrates that these instruments give a biased information about the scientific output of research in Social Sciences and Humanities."

Rossner, M., Van Epps, H. and Hill, E. (2007)
Show me the data (editorial)
The Journal of Cell Biology, Vol. 179, No. 6, 1091-1092, published online December 17, 2007
"Just as scientists would not accept the findings in a scientific paper without seeing the primary data, so should they not rely on Thomson Scientific's impact factor, which is based on hidden data. As more publication and citation data become available to the public through services like PubMed, PubMed Central, and Google Scholar, we hope that people will begin to develop their own metrics for assessing scientific quality rather than rely on an ill-defined and manifestly unscientific number."

Citrome, L. (2007)
Impact Factor? Shmimpact Factor! The Journal Impact Factor, Modern Day Literature Searching, and the Publication Process
Psychiatry, 4(5):54-57, 2007

Bornmann, L. and Daniel, H.-D. (2007)
Whatdo citation counts measure? A review of studies on citing behavior
Author eprint, undated, Journal of Documentation, accepted for publication

Meho, L. I. (2006)
The Rise and Rise of Citation Analysis
Author eprint, dLIST, 31 December 2006, Physics World, January 2007
"Provides a historical background of citation analysis, impact factor, new citation data sources (e.g., Google Scholar, Scopus, NASA's Astrophysics Data System Abstract Service, MathSciNet, ScienceDirect, SciFinder Scholar, Scitation/SPIN, and SPIRES-HEP), as well as h-index, g-index, and a-index."

Electronic Publishing Services and Oppenheim, C. (2006)
UK scholarly journals: 2006 baseline report: An evidence-based analysis of data concerning scholarly journal publishing, see Area 4: Citations, impact factors and their role
Research Information Network, Research Councils UK and the Department of Trade & Industry, October 3, 2006

Ewing, J. (2006)
Measuring Journals
Notices of the AMS, Vol. 53, No. 9, October 2006, 1049-1053
"in many respects usage statistics are even more flawed than the impact factor, and once again, the essential problem is that there are no explicit principles governing their interpretation. ... while usage statistics are only slightly useful, their misuse can be enormously damaging."

Garfield, E. (2006)
Commentary: Fifty years of citation indexing
International Journal of Epidemiology, 2006 35(5):1127-1128, published online September 19, 2006

PLoS Medicine Editors (2006)
The Impact Factor Game: It is time to find a better way to assess the scientific literature
PLoS Medicine, Vol. 3, No. 6, June 2006

Altbach, P. G. (2006)
The Tyranny of Citations
Inside Higher Ed, May 8, 2006

Noruzi, A. (2006)
The Web Impact Factor: a critical review (pdf, 10pp)
E-LIS, February 9, 2006, in The Electronic Library, 24 (2006)
"Web Impact Factor (WIF) is a quantitative tool for evaluating and ranking web sites ... search engines provide similar possibilities for the investigation of links between web sites/pages to those provided by the academic journals citation databases from the Institute of Scientific Information (ISI). But the content of the Web is not of the same nature and quality as the databases maintained by the ISI."
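The WIF mentioned above is, in the standard webometrics formulation, a simple ratio: the number of pages linking to a site divided by the number of pages the site contains. A minimal sketch, in which the function name and the counts are illustrative and not taken from the paper:

```python
# Sketch of the Web Impact Factor idea: inlinking pages per site page.
# The numbers below are hypothetical, for illustration only.

def web_impact_factor(inlinking_pages: int, site_pages: int) -> float:
    """Return the WIF ratio; a site with no pages has no defined WIF."""
    if site_pages == 0:
        raise ValueError("site has no pages")
    return inlinking_pages / site_pages

# Hypothetical example: 1200 pages link to a 300-page site.
print(web_impact_factor(1200, 300))  # -> 4.0
```

As the quoted critique notes, the analogy with journal citation databases is imperfect, since the counts come from search engines rather than a curated index.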

Bollen, J., Rodriguez, M. A. and Van de Sompel, H. (2006)
Journal Status (pdf, 16pp)
Arxiv, 9 January 2006
"By merely counting the amount of citations and disregarding the prestige of the citing journals, the ISI IF is a metric of popularity, not of prestige. We demonstrate how a weighted version of the popular PageRank algorithm can be used to obtain a metric that reflects prestige. ... Furthermore, we introduce the Y-factor which is a simple combination of both the ISI IF and the weighted PageRank, and find that the resulting journal rankings correspond well to a general understanding of journal status."
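The approach Bollen et al. describe can be illustrated with a toy sketch: iterate a PageRank-style computation over a matrix of inter-journal citation counts, then combine the result with an impact factor. Everything below is hypothetical, including the citation counts, the impact factors, and the use of a simple product as the combination rule; consult the paper for the actual Y-factor definition.

```python
# Toy weighted PageRank over a journal citation graph (illustrative only).
# citations[i][j] = number of citations from journal i to journal j.

def weighted_pagerank(citations, damping=0.85, iterations=100):
    n = len(citations)
    ranks = [1.0 / n] * n
    totals = [sum(row) for row in citations]  # outgoing citations per journal
    for _ in range(iterations):
        new = []
        for j in range(n):
            # Rank flows along citation links in proportion to their weight.
            inflow = sum(
                ranks[i] * citations[i][j] / totals[i]
                for i in range(n) if totals[i] > 0
            )
            new.append((1 - damping) / n + damping * inflow)
        ranks = new
    return ranks

# Hypothetical data: three journals and their mutual citation counts.
citations = [[0, 4, 1],
             [2, 0, 3],
             [5, 1, 0]]
impact_factors = [2.0, 1.5, 4.0]  # assumed, not real ISI IFs

pr = weighted_pagerank(citations)
y_factor = [f * r for f, r in zip(impact_factors, pr)]
```

The point of the sketch is the contrast the quote draws: the raw counts feeding an IF treat all citing journals alike, while the PageRank iteration gives more weight to citations arriving from highly ranked journals.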

Moed, H.F. (2005)
Citation analysis of scientific journals and journal impact measures
Current Science, 89 (12): 1990-1996, December 25, 2005

Dong, P., Loh, M. and Mondry, A. (2005)
The "impact factor" revisited
Biomedical Digital Libraries, December 2005
This is a review, so the findings are not new, but this is perhaps the first such paper to reflect on the effect of free and online availability on journal impact factors, among other IF-related issues.

Hardy, R., Oppenheim, C., Brody, T. and Hitchcock, S. (2005)
Open Access Citation Information (.doc, 105pp)
Author eprint, November 11, 2005, JISC Committee for the Information Environment (JCIE) Scholarly Communication Group, September 2005
Describes a proposal to increase the exposure of open access materials and their references to indexing services, and to motivate new services by reducing setup costs.

Perkel, J. M. (2005)
The Future of Citation Analysis
The Scientist, Vol. 19, No. 20, October 24, 2005
"The challenge is to track a work's impact when published in nontraditional forms"

Monastersky, R. (2005)
Impact Factors Run Into Competition
Chronicle of Higher Education, October 14, 2005


Garfield, E. (2005)
The Agony and the Ecstasy — The History and Meaning of the Journal Impact Factor (pdf, 22pp)
International Congress on Peer Review and Biomedical Publication, Chicago, September 16, 2005
Garfield's typically dry, data-filled but essential take on JIFs.

Publishers promote impact factors of OA journals
BioMed Central "Open access journals get impressive impact factors" 23 June 2005
Public Library of Science "The first impact factor for PLoS Biology - 13.9" 27 June 2005
See also this discussion of these announcements on SPARC Open Access Forum, prompted by Elsevier's response from Tony McSean, followed by David Goodman, Charles Bailey (both 8 July) and Matthew Cockerill (10 July), or see this summary of the discussion: "BMC's Impact Factors: Elsevier's Take and Reactions to It", Digital Koans (Charles Bailey's Weblog), 11 July 2005, including Peter Suber's conclusion: "It's important to distinguish the citation impact of an individual article from a journal impact factor. The BMC-Elsevier debate is about the latter. But OA is more likely to rise and fall according to the former."

Abbasi, K. (2004)
Let's dump impact factors
BMJ, Vol. 329, 16 October 2004
BMJ Rapid Responses to this editorial; also see this list response

Baudoin, L., Haeffner-Cavaillon, N., Pinhas, N., Mouchet, S. and Kordon, C. (2004)
Bibliometric indicators: realities, myth and prospective (abstract only, full paper in French)
Med Sci (Paris), 20 (10):909-15, October 2004

Jacsó, P. (2004)
The Future of Citation Indexing - Interview with Dr. Eugene Garfield (pdf 3pp)
Author eprint, in Online, January 2004

Cockerill, M. J. (2004)
Delayed impact: ISI's citation tracking choices are keeping scientists in the dark
BMC Bioinformatics 2004, 5:93, 12 July 2004

Shin E. J. (2003)
Do Impact Factors change with a change of medium? A comparison of Impact Factors when publication is by paper and through parallel publishing (abstract only)
Journal of Information Science, 29 (6): 527-533, 2003
"it is found that Impact Factors of (journals from the period) 2000 and 2001 were significantly higher than those of 1994 and 1995 in the journals published by parallel publishing (combination journals—simultaneous publication of paper and electronic journals). In particular, the Impact Factors of the combination journals increased after the journals transformed their available media from paper journals to combination ones."

Walter, G., Bloch, S., Hunt, G. and Fisher, K. (2003)
Counting on citations: a flawed way to measure quality
MJA, 2003, 178 (6): 280-281

Borgman, C. L. and Furner, J. (2002)
Scholarly Communication and Bibliometrics, author preprint (pdf 45pp)
Author eprint, in Annual Review of Information Science and Technology, Vol. 36, edited by B. Cronin, 2002

Guédon, J.-C. (2001)
In Oldenburg's Long Shadow: Librarians, Research Scientists, Publishers, and the Control of Scientific Publishing
Creating the Digital Future, Proceedings of the 138th Annual Meeting, Association of Research Libraries, Toronto, Ontario, May 23-25, 2001

Garfield, E. (1999)
Journal impact factor: a brief review
CMAJ, 161 (8), October 19, 1999

Wouters, P. (1999)
The Citation Culture (pdf 290pp)
PhD Thesis, University of Amsterdam, 1999

Garfield, E. (1998)
The use of journal impact factors and citation analysis in the evaluation of science
Author eprint, presented at the 41st Annual Meeting of the Council of Biology Editors, Salt Lake City, UT, May 4, 1998

Seglen, P. O. (1997)
Why the impact factor of journals should not be used for evaluating research
BMJ, 314:497, 15 February 1997

Garfield, E. (1973)
Citation Frequency as a Measure of Research Activity and Performance (pdf 3pp)
Author eprint, in Essays of an Information Scientist, 1: 406-408, 1962-73, Current Contents, 5, January 31, 1973

Garfield, E. (1955)
Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas
Author eprint, in Science, Vol. 122, No. 3159, pp. 108-111, July 15, 1955

Open access

Notes. In printed form, little of the published research literature was free. With more material beginning to appear on the Web from the mid-1990s, more became freely available. Open access is in a sense a formalisation of that process, a recognition that all published, refereed scholarly papers could and should be freely accessible in some form to everyone online without compromising the quality and integrity of the literature. That is the goal. This simple idea, especially when focussed on this very specific literature, seems to have been quite difficult to grasp for many bound by the old, pre-online ways of thinking. Despite the often antithetical tone of the debate, progress has been rapid since the landmark of the Budapest Open Access Initiative in February 2002, even impinging on prospective government policies by 2003 (e.g. Martin Sabo's Public Access to Science Act; UK House committee releases its report on open access; Major development in providing OA to taxpayer-funded research). It has all been brilliantly logged by Peter Suber in Open Access News (http://www.earlham.edu/~peters/fos/fosblog.html), but for a very quick overview the following papers are sufficient.

Suber, P. (updated)
Open Access Overview

Swan, A. (2007)
Open Access and the Progress of Science
American Scientist, April-June 2007
Swan justified open access in support of her 'progress' article in a list discussion. See blogged extracts from that discussion.

Swan, A. (2006)
Open Access: Why should we have it?
presented at "Zichtbaar onderzoek. Kan Open Archives daarbij helpen?" / Visible research. Can OAI help? Organised by AWI (Flemish Ministry for Economy, Enterprise, Science, Innovation and Foreign Trade) and VOWB (Flemish Organisation of Scientific Research Libraries), May 2006

Swan, A. (2005)
Open Access
JISC, Briefing Paper, 1 April 2005

Suber, P. (2004)
A Primer on Open Access to Science and Scholarship
Author eprint, in Against the Grain, Vol. 16, No. 3, June 2004

Harnad, S. (2004)
The Green Road to Open Access: A Leveraged Transition
American Scientist Forum, January 07, 2004

Suber, P. (2003)
Removing the Barriers to Research: An Introduction to Open Access for Librarians
Author eprint, in College & Research Libraries News, 64, February, 92-94, 113

