Texas A&M University Libraries Research Guides

AI-Based Literature Review Tools

Selected AI-Based Literature Review Tools


Please keep these differences in mind when exploring AI-powered academic search engines.


Library-Subscribed AI-Assisted Databases

  • Dimensions (TAMU)
    - Brings multiple linked content types together in one place for improved global discovery and analysis.
    - Two AI features: Summarization and Research GPT, which draws only on its open-access collections.

  • Statista (TAMU)
    - Contains international statistics, facts, and market data drawn from free and proprietary sources.
    - AI features: Use its Research Assistant to generate insights from queried data.
     

TAMU: AI Tools or Services


------------------------------------------------------------------------------------------------------------------------------------------------------

Independent Subscription AI Tools (Library Subscription Not Available)

  • The tools listed below may require an individual subscription, as TAMU libraries do not provide access to them.
  • Disclaimer:
    For information only. We recommend that you independently assess these tools and how you use them.
     

SEMANTIC SCHOLAR

  • SCIENTIFIC LITERATURE SEARCH ENGINE - finding semantically similar research papers.
  • "A free, AI-powered research tool for scientific literature."  <https://www.semanticscholar.org/>. But login is required in order to use all functions.
  • Over 200 millions of papers from all fields of science, the data of which has also served as a wellspring for the development of other AI-driven tools.
  • Example - nursing student mental health "scoping review"

    The 4,000+ results can be filtered by Fields of Study, Date Range, Author, and Journals & Conferences. (A scripted-search sketch using the Semantic Scholar API appears at the end of this section.)

  • Save the papers in your Library folder. The Research Feeds will recommend similar papers based on the items saved.

  • Example - SERVQUAL: A multiple-item scale for measuring consumer perceptions of service quality
    Total Citations: 22,438   [Note: these numbers were gathered when this guide was created]
    Highly Influential Citations 2,001
    Background Citations 6,109
    Methods Citations 3,273 
    Results Citations 385

  • Semantic Reader
    "Semantic Reader is an augmented reader with the potential to revolutionize scientific reading by making it more accessible and richly contextual." It "uses artificial intelligence to understand a document’s structure and merge it with the Semantic Scholar’s academic corpus, providing detailed information in context via tooltips and other overlays." <https://www.semanticscholar.org/product/semantic-reader>.

  • Skim Papers Faster
    "Find key points of this paper using automatically highlighted overlays. Available in beta on limited papers for desktop devices only."  <https://www.semanticscholar.org/product/semantic-reader>. Press on the pen icon to activate the highlights.

  • TLDRs (Too Long; Didn't Read)
    Try this example. Press the pen icon to reveal the highlighted key points.
    TLDRs "are super-short summaries of the main objective and results of a scientific paper generated using expert background knowledge and the latest GPT-3 style NLP techniques. This new feature is available in beta for nearly 60 million papers in computer science, biology, and medicine..." <https://www.semanticscholar.org/product/tldr>

ELICIT.ORG

  • AI-POWERED RESEARCH ASSISTANT - finding papers, filtering study types, automating research workflows, brainstorming, summarizing, and more.
  • "Elicit is a research assistant using language models like GPT-3 to automate parts of researchers’ workflows. Currently, the main workflow in Elicit is Literature Review. If you ask a question, Elicit will show relevant papers and summaries of key information about those papers in an easy-to-use table." <https://elicit.org/faq#what-is-elicit>. Find answers from 175 million papers. See its FAQs.
  • Example - How do mental health interventions vary by age group?   /   Fish oil and depression
    Results: [Login required]
    (1) Summary of top 4 papers > Paper #1 - #4 with Title, abstract, citations, DOI, and pdf
    (2) Table view: Abstract / Interventions / Outcomes measured / Number of participants
    (3) Relevant studies and citations.
    (4) Click on Search for Paper Information to find: Metadata about Sources (SJR, etc.) > Population (age, etc.) > Intervention (duration, etc.) > Results (outcomes, limitations, etc.) > Methodology (detailed study design, etc.)
    (5) Export as BIB or CSV
  • How to Search / Extract Data / List of Concepts Search
    - Enter a research question > Workflow: Searching > Summarizing 8 papers > A summary of the top 4 papers > Final answers. Each result shows its citation count, DOI, and a full-text link to the Semantic Scholar website for more information, such as background citations, methods citations, related papers, and more.
    - List of Concepts search - e.g., adult learning motivation. The results will present a list of related concepts.
    - Extract data from a PDF file - Upload a paper and let Elicit extract data for you.
  • Export Results - Various ways to export results (a small sketch for working with a CSV export follows this section).
  • How to Cite - Includes the elicit.org URL in the citation, for example:
    Ought; Elicit: The AI Research Assistant; https://elicit.org; accessed xxxx/xx/xx
  • 12/27/2024 - Notebook, a new workflow to make collaborative research such as systematic reviews easier: search, chat, and extract from papers > share notebooks with anyone, even those who don’t have accounts > screen papers for systematic review with yes/no/maybe > filter search results by journal quality or paper content > export search results to a RIS file.
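
  • Working with a CSV export (optional)
    The sketch below shows one generic way to post-process an Elicit CSV export in Python. The column names ("Title", "Abstract") are assumptions for illustration; they depend on what you chose to extract, so adjust them to match your own file.

    # Hypothetical post-processing of an Elicit CSV export; column names are assumptions.
    import csv

    with open("elicit_export.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    print(len(rows), "papers exported")
    for row in rows:
        title = row.get("Title", "").strip()
        abstract = row.get("Abstract", "").strip()
        print("-", title, f"({len(abstract.split())} words in abstract)")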

CONSENSUS.APP

  • ACADEMIC SEARCH ENGINE - using AI to find insights in research papers.

  • "We are a search engine that is designed to accept research questions, find relevant answers within research papers, and synthesize the results using the same language model technology." <https://consensus.app/home/blog/maximize-your-consensus-experience-with-these-best-practices/>

    Over 200 million scholarly documents / Peer-reviewed / Subjects range from the medical sciences and physics to the social sciences and economics.
  • Example - Does the death penalty reduce crime?  /  Fish oil and depression
    (1) Extracted & aggregated findings from relevant papers.
    (2) Results may include AIMS, DESIGN, PARTICIPANTS, FINDINGS or other methodological or report components.
    (3) Summaries and Full Text
  • How to Search
    Direct questions - Does the death penalty reduce crime?
    Relationship between two concepts - Fish oil and depression / Does X cause Y?
    Open-ended concepts - effects of immigration on local economies
    Tips and search examples from Consensus' Best Practice   
  • Synthesize (beta) / Consensus Meter
    When the AI recognizes certain types of research questions, this functionality may be activated. It examines a selection of studies and provides a summary along with a Consensus Meter illustrating their collective agreement. Try this search: Is white rice linked to diabetes? After analyzing 10 papers, the Consensus Meter showed the following: 70% indicate a positive association, 20% suggest a possible connection, and 10% indicate no link.
  • Consensus Plugin within ChatGPT
    Access scientific research directly within the ChatGPT interface to answer questions, find papers, and create content. https://consensus.app/home/blog/introducing-the-consensus-search-chatgpt-plugin/

    Prompt: "Write me a paragraph about the impact of climate change on GDP, with citations."

OpenRead

  • https://www.openread.academy/
  • Accessed institutionally by Harvard, MIT, University of Oxford, Johns Hopkins, Stanford, and more...
  • AI-powered Academic Searching + Web Searching - Over 300 million papers and real-time web content.
  • Trending and Topics - Browse them to find the latest hot papers. Use Topics to select specific fields and then see what is trending in them.
  • Each keyword search or AI query generates a synthesis report with citations. To adjust the results, click the Re-Generate button to refresh the report and the accompanying citations; then click Follow-Up Questions to go deeper into a specific area or subject.
  • Use Paper Q&A to interact with a text directly.
    Examples: "What does this paper say about machine translation?""What is C-1 in Fig.1?"
  • When you read a paper, under Basic Information select any of the following tools to get more information: Basic Information > Related Paper Graph> Paper Espresso > Paper Q&A, and > Notes. The Related Paper Graph will present the related studies in a visual map with relevancy indication by percentage.
  • Click on Translation to put a text or search results into another language.
  • Read or upload a document and let Paper Espresso analyze it for you. It organizes the content into a standard academic report format for easy reference: Background and Context > Research Objectives and Hypotheses > Methodology > Results and Findings > Discussion and Interpretation > Contributions to the Field > Structure and Flow > Achievements and Significance > Limitations and Future Work.

CONNECTED PAPERS

  • RELATED STUDIES
  • Uses visual graphs or other ways to show relevant studies. The database is connected to the Semantic Scholar Paper Corpus which has compiled hundreds of millions of published papers across many science and social science fields.
  • See more details about how it works.
     
  • Example - SERVQUAL
    Enter the term, then click on SELECT A PAPER TO BUILD THE GRAPH > the first paper in the list was selected.
    Results:
    (1) Origin paper - SERVQUAL: A multiple-item scale for measuring consumer perceptions of service quality + Connected papers with links to Connected Papers / PDF / DOI or Publisher's site / Semantic Scholar / Google Scholar.
    (2) Graph showing the origin paper + connected papers, with links to the major sources (see above).
    (3) Links to Prior Works and Derivative Works
    See the detailed Semantic Scholar citation breakdown for the origin SERVQUAL paper near the top of this page, under SEMANTIC SCHOLAR.
  • How to Search
    Search by work title.
    Enter some keywords about a topic.
  • Download / Save
    Download your saved items in BibTeX (.bib) format.

RESEARCH RABBIT

  • CITATION-BASED MAPPING: SIMILAR / EARLY / LATER WORKS
  • "100s of millions of academic articles and covers more than 90%+ of materials that can be found in major databases used by academic institutions (such as Scopus, Web of Science, and others)." See its FAQs page. Search algorithms were borrowed from NIH and Semantic Scholar.
  • The default “Untitled Collection” collects your search history; based on it, Research Rabbit sends you recommendations for three types of related results: Similar Works / Earlier Works / Later Works, viewable in graphs such as Network, Timeline, First Authors, etc.

  • Zotero integration: importing and exporting between these two apps.

  • Example - SERVQUAL: A multiple-item scale for measuring consumer perceptions of service quality
    [Login required] Try it to see its Similar Works, Earlier Works and Later Works or other documents.
  • Export Results - Findings can be exported in BibTeX, RIS, or CSV format (see the RIS-parsing sketch below).
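
  • Reading an RIS export (optional)
    The sketch below is a generic RIS reader, not a Research Rabbit feature: it relies on standard RIS tags (TI = title, AU = author, PY = year, ER = end of record), and the tags in a real export may differ slightly. The file name is a placeholder.

    # Generic RIS reader for an exported collection.
    def read_ris(path):
        records, current = [], {}
        with open(path, encoding="utf-8") as f:
            for line in f:
                line = line.rstrip("\n")
                if len(line) < 6 or line[2:6] != "  - ":
                    continue  # skip blanks and continuation lines in this simple sketch
                tag, value = line[:2], line[6:].strip()
                if tag == "ER":
                    records.append(current)
                    current = {}
                else:
                    current.setdefault(tag, []).append(value)
        return records

    for rec in read_ris("research_rabbit_export.ris"):
        print("; ".join(rec.get("PY", ["n.d."])),
              "; ".join(rec.get("TI", ["(no title)"])),
              "-", ", ".join(rec.get("AU", [])))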

Litmaps

  • A visualization-based literature review assistant.
  • 270+ million articles sourced from Semantic Scholar, Crossref, and OpenAlex, which together cover common datasets including PubMed, arXiv, bioRxiv, medRxiv, Web of Science, Scopus, and Microsoft Academic Graph. For more information, visit this document.

How Does Litmaps Work?

  • Seed: You enter the title or DOI of an article, which serves as the seed.
  • Seed Map: The database will generate a connection map showing how the seed paper cites or is cited by other papers.
  • Discovery: The connected lines and nodes in the map can be regrouped to facilitate further literature discovery.
     
  • Example
    "
    AI in Medical Imaging Informatics: Current Challenges and Future Directions" is a good paper for my topic on AI medical imaging,
  1. Search for this paper in Litmaps.
  2. Click on "Explore Related Articles" to generate a visual map of your initial set of related papers.
  3. Explore these papers, or click "More Like This" next to each result to discover additional relevant articles. (For a programmatic look at one of Litmaps' data sources, see the sketch below.)
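
  • A programmatic peek at one of Litmaps' sources (optional)
    Litmaps itself is used through its web interface, but OpenAlex, one of the sources listed above, has a free public API that can approximate the seed-map idea (what a paper cites and what cites it). The sketch below is an OpenAlex illustration, not a Litmaps feature; verify the endpoint and field names against the OpenAlex documentation.

    # Approximate a "seed map" with the public OpenAlex API (not a Litmaps API).
    import requests

    SEED_TITLE = ("AI in Medical Imaging Informatics: "
                  "Current Challenges and Future Directions")

    # 1) Find the seed work by title search.
    seed = requests.get(
        "https://api.openalex.org/works",
        params={"search": SEED_TITLE, "per-page": 1},
        timeout=30,
    ).json()["results"][0]
    print("Seed:", seed["display_name"], "| cited by:", seed["cited_by_count"])

    # 2) References: works the seed paper cites.
    print("References:", len(seed.get("referenced_works", [])))

    # 3) Citing works: the first page of works that cite the seed paper.
    seed_id = seed["id"].rsplit("/", 1)[-1]  # e.g. "W1234567890"
    citing = requests.get(
        "https://api.openalex.org/works",
        params={"filter": f"cites:{seed_id}", "per-page": 10},
        timeout=30,
    ).json()
    for work in citing["results"]:
        print("-", work["publication_year"], work["display_name"])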
     

Scite.Ai

-----------------------------------------------

ChatGPT

  • Check the model version and its knowledge cutoff date; check the context window size and input/output length limits.
  • ChatGPT's responses are non-deterministic by default.
  • AI language models may generate fake references.
  • According to this study <https://arxiv.org/ftp/arxiv/papers/2304/2304.06794.pdf> "ChatGPT cites the most-cited articles and journals, relying solely on Google Scholar's citation counts" within the field of environmental science.
     
  • Example - "INTERVIEW WITH CHATGPT" as a Research Method & Teaching Tool
    Some researchers have begun to use this approach to obtain research data. Try this Google Scholar search: "interview with ChatGPT", or see the two articles below:
    (1) Chatting about ChatGPT: how may AI and GPT impact academia and libraries? BD Lund, T Wang - Library Hi Tech News, 2023
    (2) An interview with ChatGPT: discussing artificial intelligence in teaching, research, and practice, G Scaringi, M Loche - 2023
     
  • Examples - RESEARCH QUESTION
    What are the effects of maternal smoking on infant mortality?

    Increased risk of sudden infant death syndrome (SIDS)
    Increased risk of premature birth
    Increased risk of low birth weight
    etc .......

  • Example -  SYNONYMS
    What are the synonyms of infant mortality?

    Infant death
    Neonatal mortality (referring specifically to deaths within the first 28 days of life)
    Perinatal mortality (referring to deaths occurring during the period from 22 completed weeks of gestation to 7 completed days after birth)
    etc.....
    [Press the Regenerate button to get more.]

  • Example -  RELATED WORDS
    What are the related words of infant mortality?

    Neonatal mortality, Post-neonatal mortality, Stillbirths, Low birth weight, Malnutrition, Infectious diseases, Vaccination, Maternal health,  Access to healthcare, Poverty, Social inequality, Sanitation, Hygiene, Water quality, Childbirth complications, Congenital abnormalities, Birth defects, Maternal age, Under-five mortality, Child mortality, Perinatal mortality, Preterm birth, Low birth weight, Sudden infant death syndrome (SIDS), Maternal mortality, Postnatal care, Malnutrition, Immunization, Access to healthcare, Clean water and sanitation, Congenital anomalies, Infant health disparities, Infant mortality rate (IMR), Infant survival.
     
  • Example of Searching with SUBJECT TERMS & KEYWORDS
    Show me a search string that combines the use of PubMed MeSH terms and keywords on the factors of infant mortality

     

    (("Infant Death"[Mesh] OR "Sudden Infant Death Syndrome"[Mesh] OR "Infant Mortality"[Mesh] OR "Perinatal Mortality"[Mesh] OR "Neonatal Mortality"[Mesh] OR "Fetal Mortality"[Mesh] OR "Infant Death" OR "Sudden Infant Death Syndrome" OR "Infant Mortality" OR "Perinatal Mortality" OR "Neonatal Mortality" OR "Fetal Mortality") AND ("Risk Factors"[Mesh] OR "Etiology"[Mesh] OR "Prevention and Control"[Mesh] OR "Interventions"[Mesh] OR "Public Health"[Mesh] OR "Health Disparities"[Mesh] OR "Healthcare Disparities"[Mesh] OR "Poverty"[Mesh] OR "Malnutrition"[Mesh] OR "Infection"[Mesh] OR "Congenital Abnormalities"[Mesh] OR "Prematurity"[Mesh] OR "Low Birth Weight"[Mesh] OR "Maternal Education"[Mesh] OR "Maternal Health"[Mesh] OR "Risk Factors" OR "Etiology" OR "Prevention and Control" OR "Interventions" OR "Public Health" OR "Health Disparities" OR "Healthcare Disparities" OR "Poverty" OR "Malnutrition" OR "Infection" OR "Congenital Abnormalities" OR "Prematurity" OR "Low Birth Weight" OR "Maternal Education" OR "Maternal Health"))

    Comments and Suggestions:
    #1 - This search string may not work on all database platforms. For example, Medline (Ovid) may not be able to run it; Medline & CINAHL (EBSCO) can run it without any problems; PubMed will run it but will indicate which MeSH terms are not indexed (see PubMed's Search History and Details under Advanced).
    #2 - This is just one possible combination. You may need to adjust it based on your task and the databases you are using.
    #3 - It might be easier to break the task into 3 parts: ask for synonyms > ask for thesaurus terms > and ask it to combine them. (A sketch for testing a string like this against PubMed programmatically appears at the end of this ChatGPT section.)

  • Examples - DATABASE RECOMMENDATIONS

    Health Sciences - Which databases to use to find articles on the EFFECTS OF MATERNAL SMOKING ON INFANT DEATH?
    PubMed: This is a database of biomedical literature from the National Library of Medicine. It includes articles from medical journals and other sources on a wide range of health-related topics.
    Embase: This is a database of biomedical literature that covers more international and pharmaceutical content than PubMed. It includes articles from medical journals and other sources on a wide range of health-related topics.
    Cochrane Library: This is a collection of databases that provides systematic reviews and meta-analyses of healthcare interventions, including interventions related to maternal smoking and infant death.
    Scopus: This is a multidisciplinary database that covers research from a wide range of fields, including health sciences. It includes articles from scientific journals, books, and conference proceedings.
    Web of Science: This is a database that covers research from a wide range of fields, including health sciences. It includes articles from scientific journals, books, and conference proceedings.
    CINAHL: This is a database of nursing and allied health literature, including articles on maternal smoking and infant death.
    [Note: The response above was copied and pasted directly from ChatGPT. No editing was added.]

    Education -- I am doing a meta-analysis on k-12 STEM education. What databases should I use?
    ERIC: This is a database of education research and information, including articles from journals, reports, and conference papers. It is a commonly used database for educational research.
    Education Source: This is a comprehensive database of education research and information, including articles from journals, books, and conference papers.
    Web of Science: This is a multidisciplinary database that covers research from a wide range of fields, including education. It includes articles from scientific journals, books, and conference proceedings.
    Scopus: This is another multidisciplinary database that covers research from a wide range of fields, including education. It includes articles from scientific journals, books, and conference proceedings.
    PsycINFO: This is a database of psychology research, including articles from journals, books, and dissertations. It can be useful for research on STEM education that includes psychological factors.
    Cochrane Library: This is a collection of databases that provides systematic reviews and meta-analyses of healthcare interventions including interventions related to education.
    [Author's note: You could also ask for the GREY LITERATURE sources related to the topic above.]

  • Example: ACT-AS-AN-EXPERT PROMPT
    Here's an alternative way to consolidate all the questions into a single consultation by using the popular Act-As-An-Expert prompt, a.k.a. a system-message prompt:
     I am going to conduct a systematic review on games and mathematics related to K-12 education. I want you to act as a systematic review expert. I will ask you some questions related to my research topic and you will advise me on how to do it.

  • [You may ask one question at a time]
    My first question is "which databases should I use?" 
    My second question is "what are the relevant subject terms, descriptors and key words?"
    My third question is "how to do data extraction?"
    ....., ....
    Please note that the above prompts are merely illustrative. Actual questions may contain more specific instructions. For example, the first question could be expanded to: "Could you provide guidance on how to identify relevant publications? Specifically, can you provide a list of the primary databases and grey literature sources related to my topic?"
     

  • Example - GREY LITERATURE

    Please show me all the resources that offer grey literature for conducting research in the field of education
        ERIC (Education Resources Information Center): https://eric.ed.gov/
        OpenGrey: http://www.opengrey.eu/
        British Education Index: https://www.ebscohost.com/academic/british-education-index
        WorldCat: https://www.worldcat.org/
        ProQuest Dissertations & Theses Global: https://www.proquest.com/products-services/pqdtglobal.html
        EThOS (Electronic Theses Online Service): https://ethos.bl.uk/
        Institutional Repositories of universities and academic institutions (search for specific institution repositories)
        American Educational Research Association (AERA): https://www.aera.net/
        National Education Association (NEA): https://www.nea.org/
        Association for Supervision and Curriculum Development (ASCD): http://www.ascd.org/
        RAND Corporation: https://www.rand.org/
        National Center for Education Research (NCER): https://ies.ed.gov/ncer/
        National Center for Education Statistics (NCES): https://nces.ed.gov/
        European Educational Research Association (EERA): https://www.eera-ecer.de/
        UNESCO International Institute for Educational Planning (IIEP): http://www.iiep.unesco.org/
        Organization for Economic Cooperation and Development (OECD): https://www.oecd.org/education/
        Education Development Center (EDC): https://www.edc.org/
        National Institute for Educational Policy Research (NIER) - Japan: https://www.nier.go.jp/
        Australian Council for Educational Research (ACER): https://www.acer.org/
        Canadian Education Association (CEA): https://www.cea-ace.ca/

    My topic is about stem education. Which grey literature databases should I use?
       STEM Education Research Centre (SERC) Grey Literature Database
       National STEM Education Research Repository (NSERR)
       National Science Digital Library (NSDL)
       STEM Education Resource Center (SERC) GrayLit Network
       STEM Education Grey Literature Library

    See other Comparison Examples under Bing AI.

  • Examples - ZERO SHOT, ONE SHOT & FEW SHOT PROMPTING for literature review
    See the examples on the left panel - How to Fine-Tune Prompts.

  • Example - Using ChatGPT4o to extract sections from a PDF

    I am going to upload a PDF file. Please extract its Research Objectives, Hypotheses, Methodology, and Findings for me, and then save them into a text file.
    ChatGPT4o: Yes, you can upload the PDF..... The uploaded document appears to be a paper titled "......" I'll begin by searching for and extracting the relevant sections. Extracting sections: Research Objectives and Hypotheses, Methodology, Results and Findings.
    Let me start with the extraction. The information has been saved into a text file. You can download it using the link below: Download Extracted_Info.txt
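
  • Testing a generated search string against PubMed (optional)
    Related to the MeSH/keyword example earlier in this section: NCBI's E-utilities (esearch) can show how many hits a ChatGPT-generated string retrieves and how PubMed translates it. The sketch below uses a shortened version of the generated string purely for illustration; NCBI asks regular users to register an API key and respect its rate limits.

    # Check a (shortened) ChatGPT-generated PubMed search string via NCBI esearch.
    import requests

    term = ('("Infant Mortality"[Mesh] OR "Sudden Infant Death Syndrome"[Mesh]) '
            'AND ("Risk Factors"[Mesh] OR "Low Birth Weight")')

    result = requests.get(
        "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi",
        params={"db": "pubmed", "term": term, "retmode": "json", "retmax": 20},
        timeout=30,
    ).json()["esearchresult"]

    print("Hits:", result["count"])
    print("How PubMed translated the query:", result.get("querytranslation", "n/a"))
    print("First PMIDs:", ", ".join(result.get("idlist", [])))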

Bing Chat / Copilot

  • Some search examples tested with Bing Chat in the past, which may or may not still apply.
  • Examples:
  • I need to find some peer-reviewed articles about infant mortality:
    Please find five empirical studies focusing on the factors influencing adult learners engaged in online education.
    Please locate five scholarly articles on stress management among college students site:pubmed.ncbi.nlm.nih.gov

  • Example - COMPARISON of Bing AI and ChatGPT
  • What are the synonyms related to academic achievement if I want to use them with ERIC (EBSCO)?
    ChatGPT
    In ERIC (EBSCO), some synonyms related to academic achievement include: Educational attainment, Learning outcomes, Educational success, Academic progress, Scholastic achievement, Academic performance, Educational achievement, Learning achievement, Academic success, Achievement outcomes.

    Bing AI

    The ERIC Thesaurus contains nearly 12,000 terms, including more than 4,500 descriptors and 7,000 synonyms. You can use the term “Academic Achievement” or its synonyms such as “Educational Achievement”, “Student Achievement”, “Academic Performance”, “Academic Success”, “Learning Achievement” and more.
    [Note: The follow-up responses include some additional sources. See two below:
    https://www.powerthesaurus.org/academic_achievement
    https://eric.ed.gov/?ti=Achievement+Gap including https://eric.ed.gov/?ti=Academic+Achievement]

    My topic is about psychology.  Which grey literature databases should I use?

    ChatGPT
    PsycEXTRA: This database is produced by the American Psychological Association (APA) and includes grey literature such as conference papers, white papers, and government reports.
    OpenGrey: This database provides access to grey literature produced in Europe, including reports, theses, and conference papers
    Grey Literature Report: This database is produced by The New York Academy of Medicine and includes reports, conference papers, and other grey literature on health and biomedical topics, including psychology.
    ProQuest Dissertations and Theses Global: This database provides access to graduate-level research, including theses and dissertations related to psychology.
    Google Scholar: While not exclusively a grey literature database, Google Scholar can be a useful tool for finding conference papers and other grey literature related to psychology.
    National Technical Information Service (NTIS): This database provides access to technical reports produced by U.S. government agencies, including those related to psychology.

    Bing AI  - [asked in the Precise mode]
    For psychology, some recommended grey literature databases include PsycEXTRA and PsyArXiv. PsycEXTRA is a database of grey literature material relating to psychology, behavioral sciences and health. PsyArXiv is a psychology archive designed to facilitate rapid dissemination of psychological research.

    What other models have been developed based on SERVQUAL?

    ChatGPT
    SERVPERF: This is a modified version of SERVQUAL that measures service performance instead of service quality. It was developed by Cronin and Taylor (1992) and is based on the same five dimensions as SERVQUAL.
    SERVPERF-GAP: This model, developed by Caruana and Ewing (2006), extends the SERVQUAL model to include three additional gaps that can occur when customers evaluate service quality.
    E-SERVQUAL: This model, developed by Parasuraman et al. (2005), adapts the SERVQUAL scale to measure the quality of online services.
    SERVCO: This is a modified version of SERVQUAL that includes a sixth dimension, "cost," in addition to the five original dimensions. It was developed by Sureshchandar et al. (2002) to address the importance of cost in service quality evaluation.
    SERVQ-HC: This is a modified version of SERVQUAL that is specifically designed for evaluating the quality of health care services. It was developed by Cronin et al. (2000) and includes 20 items across five dimensions.
    Bing AI
    The Nordic model of service quality.
    The hierarchical model of service quality.
    The American Customer Satisfaction Index (ACSI).

  • Example: FINDING SIMILAR ARTICLES
    Find similar articles like this one - [then paste in the abstract]

  • Examples: SUMMARIZE + REFERENCE
    Use a basic prompt like "Please summarize ...." or add a REFERENCE TEXT that tells the model what to focus on or what the summary should look like. The reference could be a quote, an excerpt, a partial sentence, a conversation, an abstract, a code snippet, and so on. Formula: specify a role > assign a task > provide a reference text.
    Example #1 - Scholarly Journal Quote

    STEP 1
    Prompt: As a research assistant specializing in renewable energy, your task is to locate comparable research studies based on the provided abstract from an article: "This paper discusses the perspective of renewable energy (wind, solar, wave and biomass) in the making of strategies for a sustainable development. Such strategies typically involve three major technological changes: energy savings on the demand side, efficiency improvements in the energy production, and replacement of fossil fuels by various sources of renewable energy."

    Bing Chat then found 3 documents: the ScienceDirect paper where the abstract came from and two other documents from un.org and frontiersin.org, respectively.

    STEP 2
    Prompt: Please synthesize the findings of the three studies provided above.


    Example
    You are an expert researcher, given the following article, please discuss its key findings. Reference: How Large Language Models are Transforming Machine-Paraphrased Plagiarism at https://arxiv.org/abs/2210.03568

    Example
    Please read this article and summarize it for me - "Who Should I Trust: AI or Myself? Leveraging Human and AI Correctness Likelihood to Promote Appropriate Trust in AI-Assisted Decision-Making" and then find 5 similar studies and please also summarize each for me.

  • Examples: Literature Review Matrix 
    Find five articles on a given topic first, and then use a prompt like this one:

    Generate a 3-column table: the 1st column lists the paper titles, the 2nd column their research methods, and the 3rd column their major findings.



    You can also leverage ChatGPT 3.5 to convert your search results into a review matrix. Begin by searching an academic database and saving the results in a plain text file. Next, extract only the needed information, such as titles, authors, and abstracts, and incorporate it into your input prompt. Given ChatGPT's input token limit, roughly 1,000 words, focus on extracting only the necessary details. Should the file be too large, consider splitting the task into several conversations and subsequently combining the results (one way to prepare such input programmatically is sketched below).
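
    A rough sketch of that preparation step, assuming the saved results are a CSV with "Title" and "Abstract" columns (adjust to your own export) and using a word budget as a stand-in for the token limit:

    # Rough sketch: split exported records into prompt-sized chunks for ChatGPT.
    # Column names ("Title", "Abstract") are assumptions; adjust to your export.
    import csv

    WORD_BUDGET = 800  # headroom under the ~1,000-word guideline above
    instruction = ("Generate a 3-column table: paper title, research methods, "
                   "and major findings, based on the records below.\n\n")

    with open("search_results.csv", newline="", encoding="utf-8") as f:
        records = [f'Title: {r.get("Title", "")}\nAbstract: {r.get("Abstract", "")}'
                   for r in csv.DictReader(f)]

    chunks, current, count = [], [], 0
    for rec in records:
        words = len(rec.split())
        if current and count + words > WORD_BUDGET:
            chunks.append(instruction + "\n\n".join(current))
            current, count = [], 0
        current.append(rec)
        count += words
    if current:
        chunks.append(instruction + "\n\n".join(current))

    for i, chunk in enumerate(chunks, 1):
        with open(f"prompt_chunk_{i}.txt", "w", encoding="utf-8") as out:
            out.write(chunk)
    print("Wrote", len(chunks), "prompt chunk(s).")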



     

CITING GENERATIVE AI