AIM Analytics Talk Series – Lightning Talk Session

AIM Analytics: U-M Community Lightning Talks

December 10 @ 12:00 pm – 1:30 pm


Join us on Monday, December 10, from 12:00 to 1:30 p.m. in North Quad (105 S. State St.), Space 2435, for AIM Analytics, as we invite members of the U-M community to share interesting projects they are working on related to learning analytics.

AIM Analytics is a bi-weekly seminar series for researchers across U-M who are interested in learning analytics. Learning analytics is a multi- and interdisciplinary field that brings together researchers from education, the learning sciences, computational sciences and statistics, and all discipline-specific forms of educational inquiry.

 

Slides can be found here.

Lightning Talks to include:

 

  • Social Comparison in MOOCs: Perceived SES, Opinion, and Message Formality by Heeryung Choi
    • Abstract: There has been limited research on how perceptions of socioeconomic status (SES) and opinion difference could influence peer feedback in Massive Open Online Courses (MOOCs). Using social comparison theory [11], we investigated the influence of ability- and opinion-related factors on peer feedback text in a data science MOOC. Perceived SES of peers and the formality of written responses were used as the ability-related factors, while agreement between learners represented the opinion-related factor. We focused on understanding the behaviors of the learners who are most prevalent in MOOCs: those from high socioeconomic status countries. Through two studies, we found a strong and repeated influence of agreement on affect and formality in feedback to peers. While a mediation effect of perceived SES was found, a significant effect of formality was not. This work contributes to an understanding of how social comparison theory can be operationalized in online peer writing environments.
  • Modeling Gender in Intra and Interpersonal Dynamics during Online Learning Collaborative Interactions by Yiwen Lin
    • Abstract: Evidence from past research has suggested that gender differences in collaborative learning often map onto stereotypical gender expectations. For instance, men use more aggressive language, while women appear to be more agreeable and emotional. To explore gender differences in collaborative communication, we employed the methodology of Group Communication Analysis (GCA), which allows us to examine multiple sociocognitive aspects of learner interactions. Counter to some previous findings, we did not find significant differences between men and women in the degree of participation. However, our results suggest that women have significantly higher social impact, responsivity, and internal cohesion in small-group collaborative environments. Comparing the proportion of learner interaction profiles between men and women further strengthens the evidence that women are more likely to engage in effective discourse. Our findings provide implications for pedagogical practice to increase equity and inclusivity in online collaborative learning.
  • Beyond A/B Testing: Sequential Randomization for Developing Interventions in Scaled Digital Learning Environments by Timothy NeCamp
    • Abstract: Randomized experiments ensure robust causal inference that is critical to effective learning analytics research and practice. However, traditional randomized experiments, like A/B tests, are limiting in large-scale digital learning environments. While traditional experiments can accurately compare two treatment options, they are less able to inform how to adapt interventions to continually meet learners’ diverse needs. In this work, we introduce a trial design for developing adaptive interventions in scaled digital learning environments – the sequential randomized trial (SRT). With the goal of improving learner experience and developing interventions that benefit all learners at all times, SRTs inform how to sequence, time, and personalize interventions. In this paper, we provide an overview of SRTs, and we illustrate the advantages they hold compared to traditional experiments. We describe a novel SRT run in a large-scale data science MOOC. The trial results contextualize how learner engagement can be addressed through inclusive, culturally targeted reminder emails. We also provide practical advice for researchers who aim to run their own SRTs to develop adaptive interventions in scaled digital learning environments.
  • What Can We Learn About Learner Interaction When One Course Is Hosted on Two MOOC Platforms? by Yuanru Tan
    • Abstract: Since the inception and adoption of MOOCs, pedagogues have criticized the quality of social learning within centralized platforms. Learning analytics researchers have investigated patterns of forum use and their relationship to learner performance. Yet, there are currently no cross-platform comparisons that explain how technical features of MOOC platforms may impact social interaction and the formation of learner networks. To address this issue, we analyzed MOOC discussion forum data from a single data science ethics course that ran concurrently on two different MOOC platforms (edX and Coursera). Using Social Network Analysis methods, the study compares networks of active forum posters using “Direct Reply” and “Star” tie definitions. Results show that the platforms afforded the formation of different networks, with higher connectedness and higher network centralization seen on edX. The study presents preliminary results, discusses limitations inherent in the current analysis, and sets further directions for research investigating design features of centralized discussion platforms.
  • The Impact of Student Opt-Out on Educational Predictive Models by Warren Li
    • Abstract: Privacy concerns may lead people to opt in or opt out of having their educational data collected. These decisions may impact the performance of educational predictive models. To understand this, we conducted a survey to determine the propensity of students to withhold or grant access to their data for the purposes of training predictive models. We simulated the effects of opt-out on the accuracy of educational predictive models by dropping a random sample of data over a range of increments, and then contextualized our findings using the survey results. We find that grade predictive models are fairly robust and that kappa scores do not decrease unless there is significant opt-out, but when there is, the deteriorating performance disproportionately affects certain subsamples of the population. (A minimal sketch of this opt-out simulation appears after this list.)
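
As a rough illustration of the opt-out simulation described in the last abstract, the sketch below trains a grade predictor on progressively smaller random subsamples of the training data and tracks Cohen’s kappa on a fixed test set. This is a minimal sketch of the general technique, using synthetic data and a model of our own choosing; it is not the authors’ code, dataset, or model.

```python
# Illustrative opt-out simulation (not the authors' code): train a grade
# predictor on shrinking random subsamples and track Cohen's kappa.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for student features and a binary grade outcome.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

rng = np.random.default_rng(0)
for opt_out_rate in [0.0, 0.2, 0.4, 0.6, 0.8]:
    # Simulate opt-out: a random fraction of students withhold their data.
    keep = rng.random(len(X_train)) >= opt_out_rate
    model = RandomForestClassifier(random_state=0)
    model.fit(X_train[keep], y_train[keep])
    kappa = cohen_kappa_score(y_test, model.predict(X_test))
    print(f"opt-out {opt_out_rate:.0%}: kappa = {kappa:.3f}")
```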

 


AIM Analytics: U-M Community Presentations

http://ai.umich.edu/event/aim-analytics-u-m-community-presentations/

Join us on Monday, December 4 from 12:00 to 1:30 p.m. in the Hatcher Gallery of the Hatcher Graduate Library (913 S. University Ave.) for AIM Analytics.

AIM Analytics was created to bridge gaps in support for U-M learning analytics researchers: building technical skills, sharing knowledge of educational datasets, and facilitating collaborative investigations.

For this event, we welcome members of the U-M community to share “late breaking work” within their departments.

Presentations will include:

  • Measuring the Pros and Cons of a Blended Course by Perry Samson, Arthur F. Thurnau Professor, Professor of Climate and Space Sciences and Engineering, College of Engineering, and Professor of Information, School of Information
  • Sentiment Analysis of Student Evaluations, and (separately) the Impact of Peer Feedback/Grades on TA Feedback/Grades by Heather Newman, Director of Marketing and Communications, School of Information
  • The U-M Learning Analytics Architecture (LARC) Dataset: What is it, How to Access it, and How it Enables LA Research by Steve Lonn, Director of Enrollment Research and Data Management
  • Predicting Short- and Long-Term Vocabulary Learning via Semantic Features of Partial Word Knowledge by SungJin Nam, Graduate Student Research Assistant, School of Information
  • Social Comparison Theory as Applied to MOOC Student Writing: Constructs for Opinion and Ability by Heeryung Choi, Graduate Student Research Assistant, School of Information
  • Scale MOOC Discourse Analysis with In Situ Coding by Phoebe Liang, Graduate Student Research Assistant, School of Information
 

For a description of each of the presentations, please click here.

 

AIM Analytics: The Ethics, Policy, Privacy, and Law of Research with Educational Data

Panel Discussion – Accessing Educational Datasets at Michigan: Privacy, Policy, Security, Legal, and Ethical Considerations and Responsibilities

AIM Analytics was created to bridge gaps in support for U-M learning analytics researchers: building technical skills, sharing knowledge of educational datasets, and facilitating collaborative investigations.

For this discussion, we welcome a panel of experts from the University of Michigan to share their knowledge and experience in understanding how to access and responsibly use educational data at U-M. Suitable for all faculty, postdocs, researchers and students who are looking to use educational data, this panel will provide insight into the “how,” “who” and “why” of educational data at U-M. The panel discussion will be followed by a Q&A session.

 

Panelists to include:

Sol Bermann, Interim Chief Information Security Officer, U-M Information and Technology Services

Maya Kobersy, Associate General Counsel, U-M Office of the General Counsel

Mike Daniel, Director of Policy and Operations, U-M Office of Academic Innovation

Cynthia Shindledecker, Director, U-M Health and Behavioral Sciences Institutional Review Board

 

Some of the questions that will be discussed include:

1) How will the changes in human subjects regulations impact Institutional Review Board (IRB) review of learning analytics research?

2) What does the Family Educational Rights and Privacy Act (FERPA) mean to the researcher, and how does the researcher ensure their work complies with U-M FERPA requirements?

3) Who are the data stewards, and how do you find the right person to ask for educational data?

4) What are best practices for de-identifying data? What is the difference between de-identifying data and anonymizing data?

5) What privacy and ethical considerations and best practices should I be thinking about?

6) What data security practices do I need to follow and/or should I consider?

 

AY 2018 kickoff meeting

Welcome back, everyone!


The first AIM Analytics session consists of an introduction to the events planned for this academic year, a series of micro talks, and a community discussion. The detailed agenda is below:
  • Intro to series
  • Introducing our new reading/hacking group
    • ASSISTments Data Mining Competition
  • Micro talks
    • Benjamin Koester: Measuring Effects and Outcomes of Learning Communities at UMich
    • Rohail Syed: Retrieval Algorithms Optimized for Human Learning
    • Josh Gardner: Building, Evaluating, and Replicating MOOC Dropout Models
    • Phoebe Liang: Best Answer Prediction in MOOC Discussion Forums 
    • Heeryung Choi: Understanding diversity attributes in students: From learner diversity to different opinions
    • Nia Dowell: A temporal lens: Understanding changes of MOOC learners
    • Carl Haynes: How am I doing? Student-Facing Performance Dashboards in Higher Education
    • Rebecca Quintana: Visualizing course structure: Just bead it!
  • Socialization and discussion

Learning Analytics: its emergence, trends, and systemic impact – Slides from George Siemens

Learning analytics as an academic research space has been growing in influence for nearly a decade. Campuses globally are deploying learning analytics to address a range of challenges, including student dropout, poor engagement, and targeted marketing, as well as to predict teaching and resource needs. As a field, learning analytics has advanced rapidly both as a research domain and as a practical on-campus activity to increase organizational use of data. In this presentation, Dr. George Siemens will explore both the research and the practice of analytics in education, focusing on the development of the Society for Learning Analytics Research, models for research and organizational data use, and the growing sophistication of data collection through psychophysiological approaches.

Dr. George Siemens researches networks, analytics, wellbeing, and openness in education. Dr. Siemens is Professor and Executive Director of the Learning Innovation and Networked Knowledge Research Lab at the University of Texas at Arlington and is cross-appointed with the Centre for Distance Education at Athabasca University. He has delivered keynote addresses in more than 35 countries on the influence of technology and media on education, organizations, and society. His work has been profiled in provincial, national, and international newspapers (including The New York Times), radio, and television. He has served as Principal Investigator or Co-Principal Investigator on grants totaling more than $15 million, with funding from the National Science Foundation, the Social Sciences and Humanities Research Council (Canada), Intel, the Bill & Melinda Gates Foundation, Boeing, and the Soros Foundation. He has served as a collaborator on international grants in the European Union, Australia, Senegal, Ghana, and the United Kingdom. He has received numerous awards, including honorary doctorates from Universidad de San Martín de Porres and Fraser Valley University for his pioneering work in learning, technology, and networks. He holds an honorary professorship with the University of Edinburgh and adjunct status with the University of South Australia.

Dr. Siemens is a founding President of the Society for Learning Analytics Research. He has advised government agencies in Australia, the European Union, Canada, and the United States, as well as numerous international universities, on digital learning and on using learning analytics to assess and evaluate productivity gains in the education sector and to improve learner results. In 2008, he pioneered massive open online courses (MOOCs). He blogs at http://www.elearnspace.org/blog/ and on Twitter (@gsiemens).

 

Accessing Educational Datasets at Michigan – Slides from the panel discussion

Panel: Accessing educational datasets at Michigan: privacy, policy, security, legal, and ethical considerations and responsibilities.

Join us on November 7th at 12 p.m. in the Hatcher Gallery Lab for a panel and Q&A with Maya Kobersy (U-M Associate General Counsel), Sol Bermann (University Privacy Officer), Cindy Shindledecker (IRB Director), and Mike Daniel (Director of Policy for Academic Innovation) to understand how to access and responsibly use educational data at the University of Michigan. Suitable for all faculty, postdocs, researchers, and students who are looking to use educational data, this panel will provide insight into the “how,” “who,” and “why” of educational data at U-M, and plenty of time will be left to ask questions of these experts. A light lunch will be provided.

Questions covered include:

1) How does an exemption determination from the Institutional Review Board (IRB) for research involving “normal educational practices” differ from a standard IRB approval?

2) What does the Family Educational Rights and Privacy Act (FERPA) mean to the researcher, and how does the researcher ensure their work complies with U-M FERPA requirements?

3) Who are the data stewards, and how do you find the right person to ask for educational data?

4) What are best practices for de-identifying data? What is the difference between de-identifying data and anonymizing data?

5) What privacy and ethical considerations and best practices should I be thinking about?

6) What data security practices do I need to follow and/or should I consider?

Please RSVP here.

 

AIM Analytics talk series – Nia Dowell

Nia Dowell from the University of Memphis will give a talk on Monday, October 24, from 12:00 to 1:30 p.m. in 2435 North Quad.

 

If you are planning to attend, please RSVP. More details are listed below.



Title: Group communication analysis: A computational-linguistic framework for exploring conversational roles in online multi-party communication

 

Abstract:

This talk will present results from recent work that uses language to assess social dynamics during collaborative interactions. I will introduce group communication analysis (GCA), a novel approach for detecting emergent learner roles from the participants’ contributions and patterns of interaction. This method uses automated computational linguistic analysis of the sequential interactions of participants in online group communication to create distinct interaction profiles. We have applied GCA to several collaborative learning datasets. Cluster analysis, predictive modeling, and hierarchical linear mixed-effects modeling were used to assess the validity of the GCA approach and the practical influence of learner roles on student and overall group performance. The results indicate that learners’ patterns of linguistic coordination and cohesion are representative of the roles that individuals play in collaborative discussions. More broadly, GCA provides a framework for researchers to explore the micro-level intra- and inter-personal patterns associated with participants’ roles and the sociocognitive processes related to successful collaboration.
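
GCA combines several linguistic measures; as a rough, self-contained illustration of one underlying idea (not Dowell’s implementation), the sketch below scores how similar each contribution in a toy transcript is to the turn before it, using TF-IDF vectors and cosine similarity. The transcript and variable names are invented for the example.

```python
# Toy responsivity-style measure (illustrative, not the GCA code):
# how similar is each chat turn to the one immediately before it?
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

transcript = [
    ("ana",  "I think we should normalize the scores before comparing."),
    ("ben",  "Agreed, normalizing the scores first avoids the scale problem."),
    ("ana",  "OK, who wants to write up the methods section?"),
    ("carl", "Did anyone watch the game last night?"),
]

texts = [text for _, text in transcript]
vectors = TfidfVectorizer().fit_transform(texts)
for i in range(1, len(transcript)):
    # Cosine similarity between consecutive turns as a crude proxy
    # for how responsive a speaker is to the preceding contribution.
    sim = cosine_similarity(vectors[i], vectors[i - 1])[0, 0]
    print(f"{transcript[i][0]}: similarity to previous turn = {sim:.2f}")
```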

Bio: Nia Dowell is a cognitive psychology doctoral candidate at the Institute for Intelligent Systems at the University of Memphis. Nia is currently pursuing her PhD under the mentorship of Professor Arthur Graesser. Her primary interests are in cognitive psychology, discourse processing, and the learning sciences. In general, her research focuses on using language and discourse to uncover the dynamics of socially significant, cognitive, and affective processes. She is currently applying computational techniques to model discourse and social dynamics in a variety of learning environments, including teacher education programs, intelligent tutoring systems (ITSs), small-group computer-mediated collaborative learning environments, and massive open online courses (MOOCs). Her research has also extended beyond the educational and learning sciences spaces and highlighted the practical applications of computational discourse science in the clinical, political, and social sciences.


AIM Analytics talk series – Vitomir Kovanovic

Vitomir Kovanovic from the University of Edinburgh will give a talk on Monday, October 10, from 12:00 to 1:30 p.m. in the Hatcher Gallery Lab.
If you are planning to attend, please RSVP. More details are listed below.

Title:

A Novel Model of Cognitive Presence Assessment Using Automated Learning Analytics Methods

Abstract:

One of the significant trends in education is the increased interest in the development of students’ critical and deep thinking skills. Besides creativity, collaboration, and communication, critical thinking has been recognized as one of the core 21st-century skills necessary for work in the globalized economy. One of the widely used approaches for the development of critical thinking skills is inquiry-based learning, which, instead of presenting facts and information in a smooth learning path, begins with a question, problem, or scenario, and students build knowledge through interaction with the learning content and other students. In the context of online learning, the Community of Inquiry (CoI) model is a widely used pedagogical framework that outlines the constructs that shape students’ overall learning experience, including cognitive presence, which captures the development of students’ critical and deep thinking skills. Although cognitive presence has been recognized as important to student learning outcomes, assessing it is challenging, primarily because of its latent nature and the physical constraints of online learning settings. However, the vast amounts of data collected by learning systems provide opportunities to assess students’ levels of cognitive presence through automated data analytics techniques, often referred to as learning analytics. In this presentation, we will overview a framework for the formative assessment of student cognitive presence based on learning analytics methods. These include automated analysis of student discussions and profiling of students based on their use of the available learning systems. With the goal of providing both instructors and students with timely and actionable feedback, the developed tools also make it possible to better understand the overall complexity of learning, thus advancing both the practice and theory of online learning.
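
The automated discussion analysis described above is commonly framed as text classification. As a minimal sketch of that general approach (not Kovanović’s actual features, model, or data), the snippet below trains a classifier to label discussion messages with Community of Inquiry cognitive presence phases; the tiny labeled sample is invented for illustration, and a real study would need many labeled messages and cross-validation.

```python
# Minimal sketch: cognitive presence coding as text classification.
# Illustrative only; not Kovanovic's pipeline, features, or data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented messages labeled with Community of Inquiry phases.
messages = [
    "Why does the model overfit on this dataset?",
    "Maybe the sample is too small; let's explore a few ideas.",
    "Putting these ideas together, the overfitting follows from sample size.",
    "We tested the regularized model and it resolves the problem.",
]
phases = ["triggering", "exploration", "integration", "resolution"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(messages, phases)

# Label a new, unseen discussion message with a predicted phase.
print(clf.predict(["What causes this error in the first place?"]))
```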

Bio:

Vitomir Kovanović is a Ph.D. student and research assistant at the School of Informatics, University of Edinburgh, United Kingdom, and a research assistant at the Learning Innovation and Networked Knowledge Research Lab at the University of Texas at Arlington. Vitomir’s research, in the area of Learning Analytics and Educational Data Mining, focuses on the development of novel learning analytics methods based on the trace data collected by learning management systems and their use to improve inquiry-based online education. He has authored and co-authored several book chapters, conference papers, and journal articles. Vitomir is a recipient of two best paper awards (at the LAK15 and HERDSA15 conferences) and of scholarships from the Serbian Ministry of Education, Simon Fraser University, and the University of Edinburgh.

AIM Analytics talk series – Marco Molinaro

Marco Molinaro from UC Davis will give a talk on Monday, September 12, from 12:00 to 1:30 p.m. in the Hatcher Gallery.
If you are planning to attend, please RSVP. More details are listed below.


Marco Molinaro, Ph.D., is the Assistant Vice Provost for Educational Effectiveness at UC Davis, where he oversees the Center for Educational Effectiveness, which includes learning and teaching support, instructional research and development, and educational analytics. Dr. Molinaro has over 20 years of educational experience creating and leading applications of technology for instruction, scientific visualization and simulation, tools for evidence-based instructional actions, curriculum development and evaluation, and science exhibits, serving students from elementary school through graduate school as well as the general public.

Currently, Molinaro is leading the UC Davis university-wide effort to improve undergraduate student success through the Center for Educational Effectiveness (an expansion of the former iAMSTEM Hub merged with the prior Center for Excellence in Teaching and Learning). As part of the effort, the Center is working with faculty and staff across the university to: 1) improve and evolve the introductory undergraduate curriculum, 2) understand and measure change with new analytics tools and approaches that guide instructional improvement, and 3) develop actionable student success models.

Molinaro is also the founder of the Tools for Evidence-Based Actions community, a group of researchers and administrators from over 70 universities dedicated to sharing tools and methodologies that encourage evidence-based instructional actions. His projects have been funded by the NSF, NIH, and various private foundations such as Gates, Intel, and the Helmsley Trust.