MEDLINE
According to the U.S. National Library of Medicine, "MEDLINE® (Medical Literature Analysis and Retrieval System Online) is the U.S. National Library of Medicine's® (NLM) premier bibliographic database that contains over 16 million references to journal articles in life sciences with a concentration on biomedicine. A distinctive feature of MEDLINE is that the records are indexed with NLM's Medical Subject Headings (MeSH®)."[1]
PubMed is the National Library of Medicine's free online search system for MEDLINE.
Structure
MEDLINE® (Medical Literature Analysis and Retrieval System Online) is a database of predominantly biomedical bibliographic citations maintained by the U.S. National Library of Medicine (NLM).[2] Each citation includes bibliographic data, an abstract if available, links to the full text of the article, and keywords. The keywords are indexed with the NLM's Medical Subject Headings (MeSH®)[3] and subheadings.[4]
The National Library of Medicine is investigating whether the indexing of MeSH terms can be fully or semi-automated.[5]
Methods to improve searching MEDLINE
There is much ongoing research into improving MEDLINE search results.
Research methods for comparative studies
Two experimental methods are used to compare search strategies; a small numerical sketch of the resulting measures appears after the list below.
- If a complete test collection of articles is available that is already divided into articles meeting inclusion criteria and articles not meeting criteria, then each strategy is compared on its ability to identify the articles meeting criteria (sensitivity) and to exclude the articles not meeting criteria (specificity). Sensitivity is also called "recall" by some authors.[6]
- If only a partial test collection is available, consisting solely of articles meeting inclusion criteria (for example, articles meeting inclusion criteria for ACP Journal Club,[7] articles included in a systematic review of a clinical topic, or articles in an annotated bibliography[8]), then sensitivity is again the proportion of relevant articles identified by the strategy; however, specificity cannot be computed. Instead, one of several related measures is calculated, all of which are based on the positive predictive value (PPV) of the strategy. Analogous to the PPV used in diagnostic testing, the PPV correlates directly with the prevalence of relevant articles in the collection and thus is not stable across prevalences.[9]
- Precision is "the proportion of retrieved articles that meet criteria" and thus is the same as the PPV.[10][11]
- Hit curve "is the number of important articles among the first n results."[12][13]
- Number Needed to Read (NNR) is "how many papers in a journal have to be read to find one of adequate clinical quality and relevance."[14][15][9][16] Of note, the NNR has been proposed as a metric to help libraries decide which journals to subscribe to.[14]
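To make these measures concrete, the following Python sketch computes sensitivity, specificity, precision (PPV), and NNR from a set of hypothetical retrieval counts; all numbers are invented for illustration.

# Hypothetical retrieval counts for one search strategy run against a complete
# test collection (numbers are invented for illustration only).
relevant_retrieved = 40      # articles meeting criteria that the strategy found
relevant_missed = 10         # articles meeting criteria that the strategy missed
irrelevant_retrieved = 160   # articles not meeting criteria that were retrieved
irrelevant_excluded = 790    # articles not meeting criteria that were excluded

# Sensitivity ("recall"): proportion of relevant articles that were retrieved.
sensitivity = relevant_retrieved / (relevant_retrieved + relevant_missed)

# Specificity: proportion of irrelevant articles correctly excluded
# (computable only when the test collection is complete).
specificity = irrelevant_excluded / (irrelevant_excluded + irrelevant_retrieved)

# Precision / positive predictive value: proportion of retrieved articles
# that meet criteria.
precision = relevant_retrieved / (relevant_retrieved + irrelevant_retrieved)

# Number needed to read: how many retrieved articles must be read, on average,
# to find one that meets criteria (the reciprocal of precision).
nnr = 1 / precision

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
      f"precision={precision:.2f} NNR={nnr:.1f}")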
Citation tracking
Citation tracking may help identify relevant studies in MEDLINE.[17][18]
Clustering
Clustering search results may help.[19]
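As a rough illustration of the idea, and not of the algorithm used in the cited system, the sketch below groups a few invented citation titles using TF-IDF features and k-means clustering; it assumes the scikit-learn library is available.

# Illustrative sketch only: cluster a few retrieved titles so that similar
# results are grouped together. Requires scikit-learn; the titles are
# invented examples, not real search output.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

titles = [
    "Aspirin for primary prevention of myocardial infarction",
    "Aspirin versus placebo in acute coronary syndrome",
    "Cognitive behavioural therapy for depression in adults",
    "Antidepressant drugs compared with psychotherapy for depression",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(titles)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for cluster, title in sorted(zip(labels, titles)):
    print(cluster, title)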
Filters (hedges)
MEDLINE filters are an optimal Boolean combination of search terms, both textword and MeSH terms, to search articles of particular types. For example, one filter is for identifying randomized controlled trials. Many MEDLINE filters have been developed by the Hedges team[20] supported by a grant from the National Library of Medicine.[21]
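For illustration, the sketch below combines an invented clinical topic with an RCT filter of the kind described above; the filter string follows the widely quoted publication-type-plus-textword pattern, but the Hedges team's published strategies should be consulted for the validated versions.

# Illustrative sketch: combine a clinical topic with an RCT filter. The filter
# string follows the commonly quoted "therapy, narrow" pattern (publication
# type plus textwords); it is shown here only as an example of the technique.
rct_filter = (
    "(randomized controlled trial[pt] OR "
    "(randomized[tiab] AND controlled[tiab] AND trial[tiab]))"
)

topic = "aspirin AND myocardial infarction"   # invented example topic
query = f"({topic}) AND {rct_filter}"
print(query)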
Relevancy ranking
Although MEDLINE is usually searched for exact matches using Boolean terms, relevancy ranking has been studied. In an early comparison, relevancy ranking performed well; however, the Boolean version of MEDLINE did not fully use MeSH terms.[22][23]
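As a minimal sketch of relevancy ranking, in contrast to exact Boolean matching, the example below orders a few invented candidate titles by TF-IDF cosine similarity to a free-text query; it assumes scikit-learn and is not the ranking method used in the cited studies.

# Minimal relevancy-ranking sketch: order candidate citations by TF-IDF cosine
# similarity to a free-text query instead of requiring an exact Boolean match.
# Requires scikit-learn; titles are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

query = "randomized trial of aspirin for myocardial infarction"
titles = [
    "Aspirin for primary prevention of myocardial infarction: a randomized trial",
    "Dietary salt and blood pressure: an observational study",
    "Thrombolysis after myocardial infarction",
]

vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(titles)
query_vector = vectorizer.transform([query])

scores = cosine_similarity(query_vector, doc_vectors).ravel()
for score, title in sorted(zip(scores, titles), reverse=True):
    print(f"{score:.2f}  {title}")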
Citation analysis or PageRank
There are conflicting results over the role of ranking results based on citation counts or PageRank.[16][13][8]
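To make the idea concrete, the following sketch reranks a tiny, invented citation network by raw citation counts and by PageRank; it assumes the networkx library and does not reflect the methods of the cited studies.

# Illustrative sketch: rerank a tiny, invented citation network by raw
# citation count and by PageRank. Requires the networkx library.
import networkx as nx

# Directed edges point from the citing article to the cited article.
citations = [("A", "C"), ("B", "C"), ("D", "C"), ("C", "E"), ("D", "E")]
graph = nx.DiGraph(citations)

citation_counts = dict(graph.in_degree())        # times each article is cited
pagerank_scores = nx.pagerank(graph, alpha=0.85) # importance via PageRank

for article in sorted(graph.nodes):
    print(article, citation_counts[article], round(pagerank_scores[article], 3))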
Machine learning
Machine learning methods, in which the search engine seeks articles that resemble the included articles, may be more accurate than Boolean methods.[7]
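A minimal sketch of this approach, assuming scikit-learn and a few invented labelled titles, trains a simple text classifier on citations already judged relevant or irrelevant and then scores a new citation; it is not the model used in the cited study.

# Minimal machine-learning sketch: learn from citations already judged relevant
# or not, then score a new citation. Requires scikit-learn; all titles and
# labels are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

titles = [
    "Randomized trial of aspirin for secondary prevention",   # included
    "Placebo-controlled trial of statins in older adults",    # included
    "Case report of a rare drug reaction",                    # excluded
    "Narrative review of hospital management trends",         # excluded
]
labels = [1, 1, 0, 0]  # 1 = meets inclusion criteria, 0 = does not

model = make_pipeline(TfidfVectorizer(stop_words="english"),
                      LogisticRegression())
model.fit(titles, labels)

new_title = ["Double-blind randomized trial of beta blockers after infarction"]
print(model.predict_proba(new_title)[0][1])  # estimated probability of inclusion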
Methods to access MEDLINE
There are many third-party interfaces for searching MEDLINE, such as Ovid.[24] The National Library of Medicine's own search interface is PubMed (http://pubmed.gov).
PubMed
PubMed (http://pubmed.gov) is the National Library of Medicine's own Internet interface to MEDLINE and has been freely available since 1997.
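PubMed can also be queried programmatically through the Entrez Programming Utilities (E-utilities; see External links below). A minimal sketch of a search request, using only the Python standard library and an invented query, follows.

# Minimal sketch of a PubMed search via the NCBI E-utilities esearch endpoint,
# using only the Python standard library. The query is an invented example;
# see the E-utilities documentation for full usage.
import json
import urllib.parse
import urllib.request

params = urllib.parse.urlencode({
    "db": "pubmed",
    "term": "aspirin AND myocardial infarction",
    "retmode": "json",
    "retmax": 5,
})
url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?" + params

with urllib.request.urlopen(url) as response:
    result = json.load(response)

print(result["esearchresult"]["count"])   # number of matching citations
print(result["esearchresult"]["idlist"])  # first few PubMed IDs (PMIDs)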
EBMSearch
EBMSearch (http://ebmsearch.org/) maintains its own copy of MEDLINE and uses machine learning to rank articles.[7]
HubMed
HubMed (http://www.hubmed.org/) does not maintain its own copy of MEDLINE, but rather uses PubMed's EUtils web service to retrieve MEDLINE records stored at PubMed.
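As an illustration of the kind of EUtils call such a front end can make, the sketch below retrieves one MEDLINE-formatted record from PubMed by its PMID, using only the Python standard library; the PMID is taken from the reference list below purely as an example.

# Illustrative sketch of retrieving a MEDLINE record through the E-utilities
# efetch endpoint, as a front end such as HubMed can do. Uses only the Python
# standard library; the PMID is an arbitrary example.
import urllib.parse
import urllib.request

params = urllib.parse.urlencode({
    "db": "pubmed",
    "id": "15561789",        # example PMID (cited in the references below)
    "rettype": "medline",    # return the record in MEDLINE format
    "retmode": "text",
})
url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi?" + params

with urllib.request.urlopen(url) as response:
    print(response.read().decode("utf-8"))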
Ovid
- Ovid Searching Tips (http://www.ovid.com/site/help/gateway/tech_tips.jsp)
SUMSearch
SUMSearch (http://sumsearch.uthscsa.edu/) is a federated medical search engine. It does not maintain its own copy of MEDLINE, but rather queries PubMed and revises the search when too few or too many citations are retrieved. At the same time, SUMSearch queries the National Guidelines Clearinghouse, DARE, Wikipedia, and other resources.
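To illustrate this contingent-search idea schematically, the sketch below tries progressively broader queries until the citation count falls within an acceptable range; the thresholds, queries, and result counts are all invented and do not reproduce SUMSearch's actual logic.

# Schematic sketch of contingency searching: try progressively broader queries
# until the number of retrieved citations falls in an acceptable range. The
# thresholds and the fake result counts are invented.

MIN_RESULTS = 10    # hypothetical lower bound on an acceptable yield
MAX_RESULTS = 200   # hypothetical upper bound on an acceptable yield

# Invented stand-in for a real PubMed count (e.g. the esearch call sketched above).
FAKE_COUNTS = {
    "aspirin[majr] AND myocardial infarction[majr] AND randomized controlled trial[pt]": 4,
    "aspirin AND myocardial infarction AND randomized controlled trial[pt]": 85,
    "aspirin AND myocardial infarction": 25000,
}

def count_results(query):
    return FAKE_COUNTS[query]

def contingency_search(queries_narrow_to_broad):
    """Return the first query, tried from most to least restrictive, whose
    yield falls in the acceptable range, or None if no query qualifies."""
    for query in queries_narrow_to_broad:
        n = count_results(query)
        if n > MAX_RESULTS:
            break              # even this query is too broad; stop
        if n >= MIN_RESULTS:
            return query, n    # acceptable yield; stop revising
        # too few citations: fall through to the next, broader query
    return None

print(contingency_search(list(FAKE_COUNTS)))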
References
- ↑ MEDLINE Fact Sheet. National Library of Medicine. Retrieved on 2008-01-22.
- ↑ National Library of Medicine. MEDLINE Fact Sheet. Retrieved on 2007-11-09.
- ↑ National Library of Medicine. Medical Subject Headings (MESH®) Fact Sheet. Retrieved on 2007-11-09.
- ↑ Anonymous (2008). Qualifiers - 2008. National Library of Medicine. Retrieved on 2008-03-19.
- ↑ National Library of Medicine. Indexing Initiative. Retrieved on 2007-11-25.
- ↑ Hersh, William R. (2003). Information retrieval: a health and biomedical perspective. Berlin: Springer. ISBN 0-387-95522-4.
- ↑ Aphinyanaphongs Y, Tsamardinos I, Statnikov A, Hardin D, Aliferis CF (2005). "Text categorization models for high-quality article retrieval in internal medicine". J Am Med Inform Assoc 12 (2): 207–16. DOI:10.1197/jamia.M1641. PMID 15561789.
- ↑ Herskovic JR, Bernstam EV (2005). "Using incomplete citation data for MEDLINE results ranking". AMIA Annu Symp Proc: 316–20. PMID 16779053.
- ↑ Bachmann LM, Coray R, Estermann P, Ter Riet G (2002). "Identifying diagnostic studies in MEDLINE: reducing the number needed to read". J Am Med Inform Assoc 9 (6): 653–8. PMID 12386115.
- ↑ Haynes RB, Wilczynski NL (2004). "Optimal search strategies for retrieving scientifically strong studies of diagnosis from Medline: analytical survey". BMJ 328 (7447): 1040. DOI:10.1136/bmj.38068.557998.EE. PMID 15073027.
- ↑ Zhang L, Ajiferuke I, Sampson M (2006). "Optimizing search strategies to identify randomized controlled trials in MEDLINE". BMC Med Res Methodol 6: 23. DOI:10.1186/1471-2288-6-23. PMID 16684359. PMC 1488863.
- ↑ Herskovic JR, Iyengar MS, Bernstam EV (2007). "Using hit curves to compare search algorithm performance". J Biomed Inform 40 (2): 93–9. DOI:10.1016/j.jbi.2005.12.007. PMID 16469545.
- ↑ Bernstam EV, Herskovic JR, Aphinyanaphongs Y, Aliferis CF, Sriram MG, Hersh WR (2006). "Using citation data to improve retrieval from MEDLINE". J Am Med Inform Assoc 13 (1): 96–105. DOI:10.1197/jamia.M1909. PMID 16221938.
- ↑ Toth B, Gray JA, Brice A (2005). "The number needed to read-a new measure of journal value". Health Info Libr J 22 (2): 81–2. DOI:10.1111/j.1471-1842.2005.00568.x. PMID 15910578.
- ↑ McKibbon KA, Wilczynski NL, Haynes RB (2004). "What do evidence-based secondary journals tell us about the publication of clinically important articles in primary healthcare journals?". BMC Med 2: 33. DOI:10.1186/1741-7015-2-33. PMID 15350200.
- ↑ Haase A, Follmann M, Skipka G, Kirchner H (2007). "Developing search strategies for clinical practice guidelines in SUMSearch and Google Scholar and assessing their retrieval performance". BMC Med Res Methodol 7: 28. DOI:10.1186/1471-2288-7-28. PMID 17603909.
- ↑ Bakkalbasi N, Bauer K, Glover J, Wang L (2006). "Three options for citation tracking: Google Scholar, Scopus and Web of Science". Biomed Digit Libr 3: 7. DOI:10.1186/1742-5581-3-7. PMID 16805916.
- ↑ Kuper H, Nicholson A, Hemingway H (2006). "Searching for observational studies: what does citation tracking add to PubMed? A case study in depression and coronary heart disease". BMC Med Res Methodol 6: 4. DOI:10.1186/1471-2288-6-4. PMID 16483366.
- ↑ Lin Y, Li W, Chen K, Liu Y (2007). "A document clustering and ranking system for exploring MEDLINE citations". J Am Med Inform Assoc 14 (5): 651–61. DOI:10.1197/jamia.M2215. PMID 17600104.
- ↑ Hedges Team. Search Strategies. Retrieved on 2007-11-25.
- ↑ CRISP - Computer Retrieval of Information on Scientific Projects, Abstract Display. Retrieved on 2007-11-25.
- ↑ Hersh WR, Hickam DH (1992). "A comparison of retrieval effectiveness for three methods of indexing medical literature". Am. J. Med. Sci. 303 (5): 292–300. PMID 1580316.
- ↑ Hersh WR, Hickam DH, Haynes RB, McKibbon KA (1994). "A performance and failure analysis of SAPHIRE with a MEDLINE test collection". J Am Med Inform Assoc 1 (1): 51–60. PMID 7719787.
- ↑ Anonymous. MEDLINE® - Ovid's MEDLINE. Retrieved on 2007-11-09.
External links
- PubMed
- PubMed for Handhelds
- PubMed usage statistics
- Entrez Programming Utilities
- Déjà vu: a Database of Duplicate Citations in the Scientific Literature (See Déjà vu--a study of duplicate citations in Medline PMID 18056062)