Episodes

  • Is AI the Future of Learning or the Death of Education?
    Jan 6 2025

    AI hallucinations, or confabulations, can actually foster scientific innovation by generating a wealth of ideas, even if many of them are incorrect. Craig Van Slyke and Robert E. Crossler explore how AI's ability to rapidly process information allows researchers to brainstorm and ideate more effectively, ultimately leading to significant breakthroughs in various fields. They discuss the need for a shift in how we train scientists, emphasizing critical thinking and the ability to assess AI-generated content. The conversation also touches on the potential risks of AI in education, including the challenge of maintaining student engagement and the fear of students using AI to cheat. As they dive into the latest tools like Google's Gemini and NotebookLM, the hosts highlight the importance of adapting teaching methods to leverage AI's capabilities while ensuring students develop essential skills to thrive in an AI-augmented world.

    The latest podcast episode features an engaging discussion between Craig Van Slyke and Robert E. Crossler about the impact of AI on innovation and education. They dive into the concept of AI hallucinations and confabulations, noting that while these outputs may be inaccurate, they can spark creative thinking and lead to valuable scientific breakthroughs. Crossler emphasizes that trained scientists can sift through these AI-generated ideas, helping to separate the wheat from the chaff. This perspective reframes the way we view AI's role in generating new knowledge and highlights the importance of human expertise in guiding this process.

    As the dialogue progresses, the hosts address the implications of AI on educational practices. They express concern about the reliance on self-directed learning, noting that many students struggle to engage deeply without structured support. Van Slyke and Crossler advocate for a reimagined educational framework that incorporates AI tools, encouraging educators to foster critical thinking and analytical skills. By challenging students to interact with AI outputs actively, such as critiquing AI-generated reports or creating quizzes based on their work, instructors can ensure that learning is meaningful and substantive.

    The episode also explores practical applications of AI tools like Google’s Gemini and NotebookLM for enhancing educational experiences. They discuss how these tools can facilitate research and content creation, making it easier for students to engage with complex topics. However, they also acknowledge the potential for misuse, such as cheating. The hosts argue that by redesigning assignments to focus on critical engagement with AI-generated content, educators can mitigate these risks while enriching the learning process. In summary, the episode provides a thought-provoking examination of how AI can both challenge and enhance the educational landscape, urging educators to adapt their approaches to prepare students for a future where AI is an integral part of knowledge acquisition.

    Takeaways:

    • AI hallucinations, referred to as confabulations, can stimulate scientific innovation by generating diverse ideas.
    • AI's rapid consumption of information accelerates the discovery of connections that human scientists might miss.
    • Future scientists must adapt their training to critically assess AI-generated confabulations for practical use.
    • Education needs to evolve to help students engage with AI as a tool for learning.
    • Using AI tools in the classroom can enhance critical thinking skills and analytical abilities.
    • Collaboration among educators is essential to share effective strategies for utilizing AI technologies.

    Links

    1. New York Times article: https://www.nytimes.com/2024/12/23/science/ai-hallucinations-science.html

    2. Poe.com voice...

    41 min
  • Navigating the AI Landscape: Essential Tools for Higher Education Professionals
    Dec 2 2024

    This episode of AI Goes to College discusses the practical applications of generative AI tools in academic research, focusing on how they can enhance the research process for higher education professionals. Hosts Craig Van Slyke and Robert E. Crossler cover three key tools: Connected Papers, Research Rabbit, and Scite_, highlighting their functionalities and the importance of transparency in their use. They emphasize the need for human oversight in research, cautioning against over-reliance on AI-generated content, as it may lack the critical thought necessary for rigorous academic work. The conversation also touches on the emerging tool NotebookLM, which allows users to query research articles and create study guides, while raising ethical concerns about data usage and bias in AI outputs. Ultimately, Craig and Rob encourage listeners to explore these tools thoughtfully and integrate them into their research practices while maintaining a critical perspective on the information they generate.

    ---

    The integration of generative AI tools into academic research is an evolving topic that Craig and Rob approach with both enthusiasm and caution. Their conversation centers around a recent Brown Bag series at Washington State University, where Rob's doctoral students showcased innovative AI tools designed to assist in academic research. The discussion focuses on three tools in particular: Connected Papers, Research Rabbit, and Scite_. Connected Papers stands out for its transparency, utilizing data from Semantic Scholar to create a visual map of related research, which aids users in finding relevant literature. This tool allows researchers to gauge the interconnectedness of papers and prioritize their reading based on citation frequency and relevance.

    In contrast, Research Rabbit's lack of clarity regarding its data sources and the meaning of its visual representations raises significant concerns about its reliability. Rob's critical assessment of Research Rabbit serves as a cautionary tale for researchers who might be tempted to rely solely on AI for literature discovery. He argues that while tools like Research Rabbit can provide useful starting points, they often fall short of the rigorous standards required for academic research. The hosts also discuss Scite_, which generates literature reviews based on user input. Although Scite_ can save time for researchers, both Craig and Rob emphasize the necessity of critical engagement with the content, warning against over-reliance on AI-generated summaries that may lack depth and nuance.

    Throughout the episode, the overarching message is clear: while generative AI can enhance research efficiency, it cannot replace the need for critical thinking and human discernment in the research process. Craig and Rob encourage their listeners to embrace these tools as aides rather than crutches, fostering a mindset of skepticism and inquiry. They underscore the importance of maintaining academic integrity in the face of rapidly advancing technology, reminding researchers that their insights and interpretations are invaluable in shaping the future of scholarship. By the end of the episode, listeners are equipped with practical advice on how to navigate the intersection of AI and research, ensuring that they harness the power of these tools responsibly and effectively.

    Takeaways:

    • Generative AI tools can help streamline academic research but should not replace critical thinking.
    • Connected Papers offers transparency in sourcing research papers, unlike some other tools.
    • Students must remain skeptical of AI outputs, ensuring they apply critical thought in research.
    • Tools like NotebookLM can assist in summarizing and querying research articles effectively.
    • Using AI can eliminate busy work, allowing researchers to focus on adding unique insights.
    • Educators need to guide students on how to leverage AI tools...
    40 min
  • AI detectors, amazing slides with Beautiful AI and Gemini as an AI gateway
    Nov 18 2024

    Generative AI is reshaping the landscape of higher education, but the introduction of AI detectors has raised significant concerns among educators. Craig Van Slyke and Robert E. Crossler delve into the limitations and biases of these tools, arguing that they can unfairly penalize innocent students, particularly non-native English speakers. Drawing on evidence from their own experiences, they assert that relying solely on AI detection tools is misguided, and they encourage educators to focus more on the quality of student work than on the potential use of generative AI. The conversation also highlights the need for context and understanding in assignment design, suggesting that assignments should be tailored to class discussions to ensure students engage meaningfully with the material. As generative AI tools become increasingly integrated into everyday writing aids like Grammarly, the lines blur between acceptable assistance and academic dishonesty, making it crucial for educators to adapt their approaches to assessment and feedback.

    In addition to discussing the challenges posed by AI detectors, the hosts introduce Beautiful AI, a powerful slide deck creation tool that leverages generative AI to produce visually stunning presentations. Craig shares his experiences with Beautiful AI, noting its ability to generate compelling slides that enhance the quality of presentations without requiring extensive editing. This tool represents a shift in how educators can approach presentations, allowing for a more design-focused experience that can save significant time. The episode encourages educators to explore such tools that can streamline their workflows and improve the quality of their output, ultimately promoting a more effective use of technology in educational settings. The discussion culminates with a call for educators to embrace generative AI not as a threat but as a resource that can enhance learning and teaching practices.

    Takeaways:

    • AI detectors are currently unreliable and can unfairly penalize innocent students. It's essential to critically evaluate their results rather than accept them blindly.
    • The biases in AI detectors often target non-native English speakers, leading to unfair accusations of cheating.
    • Generative AI tools can enhance the quality of writing and presentations, making them more visually appealing and easier to create.
    • Beautiful AI can generate visually stunning slide decks quickly, saving time while maintaining quality.
    • Using tools like Gemini can significantly streamline the process of finding accurate information online, offering a more efficient alternative to traditional searches.
    • Educators should contextualize assignments to encourage originality and understanding, rather than relying solely on AI detection tools.

    Links referenced in this episode:

    • gemini.google.com
    • beautiful.ai


    Companies mentioned in this episode:

    • Grammarly
    • Shutterstock
    • Beautiful AI
    • Google
    • Wright State University
    • WSU
    • Gemini

    Mentioned in this episode:

    AI Goes to College Newsletter

    29 min
  • Google NotebookLM and Our AI Toolkits
    Oct 22 2024

    Craig and Rob dig into the innovative features of Google's Notebook LM, a tool that allows users to upload documents and generate responses based on that content. They discuss how this tool has been particularly beneficial in an academic setting, enhancing students' confidence in their understanding of course materials. The conversation also highlights the importance of using generative AI as a supplement to learning rather than a replacement, emphasizing the need for critical engagement with the technology. Additionally, they share their personal AI toolkits, exploring various tools like Copilot, ChatGPT, and Claude, each with unique strengths for different tasks. The episode wraps up with a look at specialized tools such as Lex, Consensus, and Perplexity AI, encouraging listeners to experiment with these technologies to improve their efficiency and effectiveness in academic and professional environments.

    Highlights:

    • 00:17 - Exploring Google's Notebook LM
    • 01:25 - Rob's Experience with Notebook LM in Education
    • 02:05 - The Impact of Notebook LM on Student Learning
    • 04:00 - Creating Podcasts with Notebook LM
    • 05:35 - Generative AI and Student Engagement
    • 06:00 - The Unpredictability of AI Responses
    • 09:35 - Innovative Uses of Generative AI
    • 11:03 - Personal AI Toolkits: What's in Use?
    • 11:10 - Comparing Copilot and ChatGPT/Claude
    • 26:55 - Specialized AI Tools: Perplexity and Consensus
    • 37:22 - Conclusion and Encouragement to Explore AI Tools

    Products and websites mentioned

    Google Notebook LM: https://notebooklm.google.com/

    Perplexity.ai: https://www.perplexity.ai/

    Consensus.app: https://consensus.app/search/

    Lex.page: https://lex.page/

    Craig's AI Goes to College Substack: https://aigoestocollege.substack.com/

    Mentioned in this episode:

    AI Goes to College Newsletter

    39 min
  • Leveraging Copilot and Claude to increase productivity in higher ed
    Oct 1 2024

    This episode of AI Goes to College explores the transformative role of generative AI in higher education, with a particular focus on Microsoft's Copilot and its application in streamlining administrative tasks. Dr. Craig Van Slyke and Dr. Robert E. Crossler share their personal experiences, highlighting how AI tools like Copilot can significantly reduce the time spent on routine emails, agenda creation, and recommendation letters. They emphasize the importance of integrating AI tools into one's workflow to enhance productivity and the value of transparency when using AI-generated content. The episode also examines the broader implications of AI adoption in educational institutions, noting the challenges of choosing the right tools while considering privacy and intellectual property concerns. Additionally, the hosts discuss the innovative potential of AI in transforming pedagogical approaches and the importance of students showcasing their AI skills during job interviews to gain a competitive edge.

    In this insightful discussion, Dr. Craig Van Slyke and Dr. Robert E. Crossler explored the transformative potential of generative AI in higher education. Drawing from their extensive experience, they examined how Microsoft's Copilot can alleviate the administrative burdens faced by educators. Dr. Crossler shared his firsthand experience with Copilot's ability to draft emails and create meeting agendas, highlighting the significant time savings and productivity gains for academic professionals. This practical use of AI allows educators to redirect their efforts towards more meaningful tasks such as curriculum development and student engagement.

    The hosts also addressed the information overload surrounding AI advancements, advising educators to focus on tools that offer tangible benefits rather than getting caught up in the hype. They discussed the strategic decisions universities face in selecting AI technologies, emphasizing the need for thoughtful integration to maximize educational impact. This conversation underscored the necessity for higher education institutions to remain agile and informed as they navigate the evolving landscape of AI technologies.

    Further, the episode examined AI tools like Claude and Gemini, showcasing their potential to enhance both academic and personal productivity. Claude's artifact feature was highlighted for its ability to organize AI-generated content, providing a structured approach to integrating AI solutions in educational tasks. Meanwhile, Gemini's prowess in tech support and everyday problem-solving was noted as a testament to AI's versatility. The hosts concluded with advice for students entering the job market, encouraging them to leverage their AI skills to gain a competitive edge in their careers.

    Takeaways:

    • Generative AI tools can substantially reduce the time spent on routine tasks like email writing.
    • Higher education professionals can leverage AI for tasks such as creating meeting agendas and recommendations.
    • Using AI requires a shift in how tasks are approached, focusing more on content creation.
    • Schools may need to decide which AI tools to support based on their specific needs.
    • AI tools like Microsoft Copilot can assist in writing by offering different styles and tones.
    • Experimentation with AI in professional settings can lead to significant productivity improvements.

    The AI Goes to College podcast is a companion to the AI Goes to College newsletter (https://aigoestocollege.substack.com/). Both are available at https://www.aigoestocollege.com/.

    Do you have comments on this episode or topics that you'd like us to cover? Email Craig at craig@AIGoesToCollege.com. You can also leave a comment at https://www.aigoestocollege.com/.

    54 min
  • Is ChatGPT Bull ... and How to Improve Communication with AI
    Jul 29 2024

    Is ChatGPT bull ...? Maybe not.

    In this episode, Rob and Craig talk about how generative AI can be used to improve communication, give their opinions of a recent article claiming that ChatGPT is bull$hit, and discuss why you need an AI policy.

    Key Takeaways:

    • AI can be used to improve written communication, but not if you just ask AI to crank out the message. You have to work WITH AI. Rob gives an interesting example of how AI was used to write a difficult message. The key is to co-produce with AI, which results in better outcomes than if either the human or the AI worked alone.
    • Is ChatGPT Bull$hit? A recent article in Ethics and Information Technology claims that ChatGPT (and generative AI more generally) is bull$hit. Craig and Rob aren't so sure, although the authors make some reasonable points.
    • You need an AI policy, even if your college doesn't have one yet. Not only does a policy help you manage risk; a clear policy is also necessary to help students understand what is, and is not, acceptable. Otherwise, students are flying blind.

    Hicks, M.T., Humphries, J. & Slater, J. (2024). ChatGPT is bullshit. Ethics and Information Technology, 26(38). https://doi.org/10.1007/s10676-024-09775-5 https://link.springer.com/article/10.1007/s10676-024-09775-5

    Mentioned in this episode:

    AI Goes to College Newsletter

    34 min
  • AI in higher ed: Is it time to rethink grading?
    Jul 15 2024

    In this episode of AI Goes to College, Craig and Rob dig into the transformative impact of artificial intelligence on higher education. They explore three critical areas where AI is reshaping the academic landscape, offering valuable perspectives for educators, administrators, and students alike.

    The episode kicks off with a thoughtful discussion on helping students embrace a long-term view of learning in an era where AI tools make short-term solutions readily available. Craig and Rob tackle the challenges of detecting AI-assisted cheating and propose innovative approaches to course design and assessment. They emphasize the importance of aligning learning objectives with real-world skills and knowledge retention, rather than focusing solely on grades or easily automated tasks. At the end of it all, they wonder if it's time to rethink grading.

    Next, the hosts examine recent developments in language models, highlighting the remarkable advancements in speed and capabilities available in Anthropic’s new model, Claude 3.5 Sonnet. They introduce listeners to new features like "artifacts" that enhance user experience and discuss the potential impacts on various academic disciplines, particularly in programming education and research methodologies. This segment offers a balanced view of the exciting possibilities and the ethical considerations surrounding these powerful tools.

    The final portion of the episode covers the complex world of copyright issues surrounding AI-generated content. Craig and Rob break down the ongoing debate around web scraping practices for AI training data and explore the potential legal and ethical implications for AI users in academic settings. They stress the importance of critical thinking when utilizing AI tools and provide practical advice for educators and students on responsible AI use.

    Throughout the episode, the hosts share personal insights, anecdotes from their teaching experiences, and references to current research and industry developments. They maintain a forward-thinking yet grounded approach, acknowledging the uncertainties in this rapidly evolving field while offering actionable strategies for navigating the AI revolution in higher education.

    This episode is essential listening for anyone involved in or interested in the future of education. It equips listeners with the knowledge and perspectives needed to adapt to and thrive in an AI-enhanced academic environment. Craig and Rob's engaging dialogue not only informs but also inspires listeners to actively participate in shaping the future of education in the age of AI.

    Whether you're a seasoned educator, a curious student, or an education technology enthusiast, this episode of AI Goes to College provides valuable insights and sparks important conversations about the intersection of AI and higher education.

    Mentioned in this episode:

    AI Goes to College Newsletter

    45 min
  • Encouraging ethical use, AI friction and why you might be the problem
    Jul 1 2024

    We're in an odd situation with AI. Many ethical students are afraid to use it and unethical students use it ... unethically. Rob and Craig discuss this dilemma and what we can do about it.

    They also cover the concept of AI friction and how Apple's recent moves will address this underappreciated barrier to AI use.

    Other topics include:

    • Which AI chatbot is "best" at the moment
    • Using AI to supplement you, not replace you
    • Why you might be using AI wrong
    • Active learning with AI
    • and more!

    ---

    The AI Goes to College podcast is a companion to the AI Goes to College newsletter (https://aigoestocollege.substack.com/). Both are available at https://www.aigoestocollege.com/.

    Do you have comments on this episode or topics that you'd like us to cover? Email Craig at craig@AIGoesToCollege.com. You can also leave a comment at https://www.aigoestocollege.com/.

    39 min