Summary / background

  • Performance in a specific intro course correlates with subsequent success (see the sketch after this list)
  • Perhaps gen-ed courses both build skills and reflect the capabilities that drive future success
  • Results of data analysis were used to justify adding advising staff, which in turn increased student success
  • All of your activities around campus can be used to monitor how you are doing
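A minimal sketch of the kind of correlation analysis summarized above, using made-up numbers; the column names (intro_grade, later_gpa) are hypothetical, and a real analysis would use institutional records and control for other factors.

```python
# Minimal sketch with made-up numbers: does performance in an intro course
# track later success? Column names (intro_grade, later_gpa) are hypothetical.
import pandas as pd

records = pd.DataFrame({
    "intro_grade": [3.7, 2.3, 3.0, 1.7, 4.0, 2.7, 3.3],   # grade in the intro course
    "later_gpa":   [3.8, 2.5, 3.1, 2.0, 3.9, 3.0, 3.2],   # GPA in subsequent terms
})

# Pearson correlation between intro-course grade and subsequent GPA
r = records["intro_grade"].corr(records["later_gpa"])
print(f"correlation between intro grade and later GPA: r = {r:.2f}")
```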

Ethics

  • Do students know data is being collected? How is it being used?
  • Security of data systems? Could data be accidentally released?
  • How are results used?
    • Are results used to fix an issue and help students, or to make the university look good?
  • Should data be analyzed and used?
  • How much should we be tracking and using real-time data?

Additional ideas from Sakai responses

  • A positive predictor doesn’t guarantee an outcome for any single student (see the sketch at the end of this list)
  • Sometimes the results don’t match our personal observations, possibly because individuals don’t conform to averages.
  • But we should keep in mind that anecdotes are small samples compared to big data analytics
  • The underlying reasons for not doing well in a “predictor” course might be complicated. Does the reason for a pattern matter? It might influence the response.
  • Assuming students are informed they are being “tracked”, how are security / privacy concerns handled?
  • Does social belonging really matter? What’s the evidence?
  • What about the potential for data to be used in ways that are not purely beneficial to students?
  • Sometimes analytics results don’t match intuition
  • Do analytics and automated interventions displace human decision making?
  • How results are used can be beneficial or harmful
  • Even with all the data, it’s the human interaction that has an impact
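A brief illustration of the first point above: a positive predictor yields a probability, not a guarantee, for any one student. The data and the simple logistic model below are hypothetical.

```python
# Made-up data: a "positive predictor" shifts probabilities but does not
# determine the outcome for any individual student.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical records: intro-course grade -> graduated (1) or not (0)
grades    = np.array([[1.7], [2.0], [2.3], [2.7], [3.0], [3.3], [3.7], [4.0]])
graduated = np.array([0, 0, 1, 0, 1, 1, 1, 1])

model = LogisticRegression().fit(grades, graduated)

# Even the strongest grade maps to a probability below 1.0, so an
# individual student can still diverge from the aggregate pattern.
for g in (2.0, 3.0, 4.0):
    p = model.predict_proba([[g]])[0, 1]
    print(f"intro grade {g:.1f} -> estimated completion probability {p:.2f}")
```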