Presented by:
Regina Yan Rochester
Key Statement:
Explore the integration of artificial intelligence (AI) tools in higher education assessment and their influence on syllabus design for improved learning outcomes.
Keywords:
AI Tools, Higher Education Assessment, Syllabus Design
Abstract:
Integrating artificial intelligence (AI) into assessment technologies holds great promise for higher education. AI-powered systems streamline continuous feedback integration, utilizing diverse artifacts. A well-crafted syllabus acts as a course blueprint, shaping structure and enhancing quality. It delineates student and instructor roles throughout the term. This presentation explores the practical use of AI tools in higher education assessment, specifically their impact on syllabus design. Through qualitative analysis with MAXQDA, this study examines a range of syllabi to uncover the implications of AI tools for assessment strategies and curriculum development.
Learning Outcomes:
1. Understand the role of AI tools in higher education assessment.
2. Evaluate the impact of AI tools on syllabus design and student learning.
3. Identify best practices for integrating AI tools into assessment and curriculum planning.
Hear it from the author:
TRANSCRIPT:
Introduction
The integration of artificial intelligence (AI) tools into higher education assessment processes represents a significant advancement in educational technology. AI has the potential to revolutionize the way assessments are conducted, providing educators with valuable insights and improving learning outcomes for students. This poster delves into the practical application of AI tools in higher education assessment, focusing particularly on their influence on syllabus design and their impact on enhancing learning experiences.
Rationale
The use of AI tools in higher education assessment is driven by the need for more efficient and effective evaluation methods. Traditional assessment approaches often struggle to keep pace with the dynamic learning environments of today's students. AI-powered systems offer solutions such as personalized feedback, adaptive testing, and predictive analytics, thus enhancing the overall learning experience. By examining the integration of AI tools in syllabi, this research aims to uncover the potential benefits and challenges associated with this technological shift.
Methodology
This study employs a qualitative analysis methodology using MAXQDA software to analyze a sample of 80 syllabi from various universities and majors across the United States. The selection criteria focus on diversity in disciplines and inclusion of AI-related content. The analysis involves coding and thematically organizing data to identify patterns and trends in the implementation and impact of AI tools in educational settings. Key areas of interest include personalized feedback, adaptive testing, and curriculum development. The qualitative approach allows for an in-depth understanding of the practical implications of AI in higher education.
Analysis
• Data Collection: Collected syllabi from diverse higher education institutions.
• Qualitative Analysis: Utilized MAXQDA for thematic analysis of syllabi content related to AI tools and assessment strategies.
• Figure Creation: Employed Lucidchart to visualize key findings and trends.
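The coding step described above was carried out in MAXQDA, a GUI application. As a rough illustration only, the same keyword-based thematic coding can be sketched in Python; the codes and keywords below are hypothetical stand-ins, not the study's actual codebook:

```python
from collections import Counter

# Hypothetical codebook: code names mapped to indicator keywords.
# The study's real MAXQDA codebook is not reproduced here.
CODEBOOK = {
    "personalized_feedback": ["personalized feedback", "automated feedback"],
    "adaptive_testing": ["adaptive test", "adaptive assessment"],
    "ai_policy": ["chatgpt", "generative ai", "ai tools"],
}

def code_syllabus(text: str) -> Counter:
    """Count how often each code's keywords appear in one syllabus."""
    text = text.lower()
    counts = Counter()
    for code, keywords in CODEBOOK.items():
        counts[code] = sum(text.count(kw) for kw in keywords)
    return counts

def theme_frequencies(syllabi: list[str]) -> Counter:
    """Number of syllabi in which each code appears at least once."""
    totals = Counter()
    for text in syllabi:
        for code, n in code_syllabus(text).items():
            if n > 0:
                totals[code] += 1
    return totals
```

In practice, qualitative coding also involves human judgment (reading passages in context, merging and splitting codes), which simple keyword matching cannot replace; the sketch only mirrors the frequency-counting part of the workflow.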
Results/Findings
• Streamlined Continuous Feedback: AI-powered systems facilitate more timely and accurate feedback mechanisms compared to traditional assessment methods. This improves student engagement and performance.
• Enhanced Predictive Analytics: AI tools can analyze large datasets to identify learning patterns and predict student outcomes, allowing for more targeted interventions.
• Adaptive Testing: AI enables the creation of adaptive tests that adjust in real-time to a student's ability level, providing a more personalized assessment experience.
• Curriculum Development: AI tools assist in the refinement of curriculum design, providing insights into effective teaching and learning strategies. They help in aligning educational goals with student needs.
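To make the adaptive-testing finding concrete, here is a minimal sketch assuming a simplified Rasch (one-parameter IRT) model. The item-selection and ability-update rules below are illustrative conventions, not the mechanism of any particular AI tool discussed in the study:

```python
import math

def p_correct(ability: float, difficulty: float) -> float:
    """Rasch model: probability a student answers an item correctly."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def next_item(item_bank: list[float], ability: float) -> float:
    """Pick the item whose difficulty is closest to the current ability
    estimate -- the most informative item under the Rasch model."""
    return min(item_bank, key=lambda d: abs(d - ability))

def update_ability(ability: float, difficulty: float, correct: bool,
                   step: float = 0.5) -> float:
    """Nudge the ability estimate toward the observed response:
    up after a correct answer, down after an incorrect one."""
    observed = 1.0 if correct else 0.0
    return ability + step * (observed - p_correct(ability, difficulty))
```

Each response shifts the ability estimate, and the next item is drawn near that new estimate, which is how an adaptive test "adjusts in real time" to the student's level.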
Recommendations
1. Implement AI-Driven Feedback: Use AI for personalized feedback and automated grading.
2. Explore New Assessment Strategies: Try adaptive testing and interactive assessments.
3. Use Data Analytics: Utilize AI for insights into student performance and learning patterns.
4. Integrate AI in Curriculum Design: Incorporate AI tools to optimize course content and pedagogy.
5. Provide Faculty Training: Offer training for faculty on AI tools and ethical use.
6. Ensure Transparency and Ethics: Establish clear guidelines for AI use and data privacy.
7. Evaluate and Adapt: Continuously assess AI's impact and adjust strategies accordingly.
Conclusion
In conclusion, the integration of AI tools in higher education assessment holds immense potential for improving learning outcomes. By facilitating efficient feedback mechanisms and contributing to innovative assessment strategies, AI tools pave the way for more effective evaluation methods and curriculum development. The role of well-structured syllabi remains crucial as a guiding blueprint, ensuring clarity of roles and expectations for students and instructors alike. As AI technology continues to evolve, ongoing research is vital to harnessing its full potential and creating positive learning experiences in higher education.
REFERENCES:
Palmer, M. S., Wheeler, L. B., & Aneece, I. (2016). Does the document matter? The evolving role of syllabi in higher education. Change: The Magazine of Higher Learning, 48(4), 36–47. https://doi.org/10.1080/00091383.2016.1198186
Rudolph, J., Tan, S., & Tan, S. (2023). ChatGPT: Bullshit spewer or the end of traditional assessments in higher education? Journal of Applied Learning and Teaching, 6(1). https://doi.org/10.37074/jalt.2023.6.1.9
Smolin, D., & Butakov, S. (2012). Applying artificial intelligence to the educational data: An example of syllabus quality analysis. In LAK '12: Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 164–169). https://doi.org/10.1145/2330601.2330644