
ELI 2015 Conference Notes

Highlights from the ELI 2015 conference in Anaheim, CA (besides the 75 degree weather).

BlendKit – From my alma mater, the University of Central Florida. This MOOC/resource helps faculty and institutions create blended learning courses. From the website: The goal of the BlendKit Course is to provide assistance in designing and developing your blended learning course via a consideration of key issues related to blended learning and practical step-by-step guidance in helping you produce actual materials for your blended course (i.e., from design documents through creating content pages to peer review feedback at your own institution).

The Symbiotic Research Toolkit – A research toolkit for students from Georgia University. The idea is that students don't always know how to use the internet as a resource for research. It might be a good resource in the CTL.


Not Everyone Gets a Trophy – Mark De Vinck, Dexter F. Baker Professor of Practice in Creativity, Lehigh University

Outcomes: Understand the importance of creativity as it relates to innovation, understand the value of hands-on learning, learn how to teach failure without failing.

This faculty member runs a maker lab at the university and provides structured lessons to help students overcome failure, teaching them to be persistent and resilient in the face of failures. Each student keeps an inventor's notebook which documents their attempts and processes, as well as ideas arising from real hands-on work on problems. He's found the most useful boost for innovation is the creation of a safe space for students to explore all ideas and to approach obstacles as opportunities for learning. He claims that all that is needed for a maker space is a couch, a popcorn maker, and coffee. De Vinck talks about using systematic creativity (nicely defined by Mindfolio here) and the six thinking hats, defined as follows (found in Wikipedia):

Six distinct directions are identified and assigned a color. The six directions are:

  • Managing Blue – what is the subject? what are we thinking about? what is the goal?
  • Information White – considering purely what information is available, what are the facts?
  • Emotions Red – intuitive or instinctive gut reactions or statements of emotional feeling (but not any justification)
  • Discernment Black – logic applied to identifying reasons to be cautious and conservative
  • Optimistic response Yellow – logic applied to identifying benefits, seeking harmony
  • Creativity Green – statements of provocation and investigation, seeing where a thought goes

Takeaways: I enjoyed this talk and the enthusiasm of the professor. I wonder if we could incorporate this type of problem solving into humanities courses. These innovation and maker-space approaches work well in engineering and other sciences, but maybe in Public Humanities or Public Health? Perhaps disciplines where groups work together to discover underlying concepts might benefit from a systematic approach to thinking innovatively.

Learning at Scale and the Science of Learning – Justin Reich, Richard L. Menschel HarvardX Research Fellow, Harvard University

Outcomes: Learn how to distinguish between participation data (which is abundant) and learning data (which is scarce), learn about a taxonomy of current research approaches ranging from fishing in the data exhaust to design research in the core, understand the importance of randomized experiments (A/B testing) to advancing the science of learning.

At the last ELI conference, in New Orleans in 2014, MOOCs had a high profile. They were reaching a peak of interest and a flurry of activity. Now that we've got a body of work done, we seem to be entering a time of deep assessment of the outcomes. Reich has written a few white papers about his research on HarvardX (one found here). Just as there is diversity in learning experiences, there is also diversity of goals, and that diversity is central to understanding the enterprise. There is a difference between measures of activity and measures of learning – there's a lot of data about what people click on, but not about what goes on in their heads. The question is: what kinds of things are students doing that are helping learning outcomes?

Reich believes we should reboot MOOC research, and he offered suggestions for how we might do more research on the learning rather than the engagement (clicks).

Improving structures for assessment

  1. measure full range of competencies
  2. measure change in competency over time (a small pre/post sketch follows this list)
  3. borrow validated assessments from elsewhere
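
To make the second item concrete: one approach is to give the same validated instrument before and after the course and report the gain, rather than counting clicks. Here is a minimal sketch in Python; the scores are invented, and the use of Hake's normalized gain is my own illustration, not something Reich prescribed:

```python
# Toy pre/post comparison: measures change in competency over time
# rather than activity. All scores here are invented for illustration.

def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Fraction of the possible improvement actually achieved (Hake's g)."""
    if pre >= max_score:           # already at ceiling; no room to improve
        return 0.0
    return (post - pre) / (max_score - pre)

# (pre-test, post-test) scores on the same validated instrument
students = {"s01": (42.0, 71.0), "s02": (65.0, 80.0), "s03": (55.0, 58.0)}

gains = {sid: normalized_gain(pre, post) for sid, (pre, post) in students.items()}
print(f"average normalized gain: {sum(gains.values()) / len(gains):.2f}")
```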

MOOC research has the following options at this time:

  1. fishing in the exhaust (mining the clickstream data)
  2. experiments in the periphery – domain independent (they don't have anything to do with the discipline being taught, e.g. learning styles or course design options), which means you can plop them into different domains (disciplines); a toy randomized-assignment sketch follows this list
  3. design research in the core – helps explain how to get students past a barrier or threshold and how to help students learn the core concepts of a course better
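
Because peripheral experiments are domain independent, the same randomized design can be dropped into any course. Here is a toy sketch of the assignment step in Python; the experiment name and user IDs are hypothetical, and this shows only randomization, not the analysis:

```python
import hashlib

def assign_arm(user_id: str, experiment: str) -> str:
    """Stable random assignment: hash (experiment, user) so a learner
    always lands in the same arm across sessions and courses."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "B" if int(digest, 16) % 2 else "A"

# e.g. a domain-independent nudge (a planning prompt) tested across MOOCs
for uid in ["alice", "bob", "carol", "dave"]:
    print(uid, assign_arm(uid, "planning-prompt-v1"))
```

Hashing rather than flipping a coin at runtime keeps the experiment reproducible: the same learner sees a consistent condition no matter which course the intervention is plopped into.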

Takeaways: I thought this talk highlighted just how hard it is to create meaningful assessments of learning in an area with such a diverse set of students. It seems to me that assessment might be based on the goals of the groups who want information about how MOOCs are doing. Faculty would probably be interested in assessing whether students understand the core concepts of a course; administrators might be more interested in enrollment numbers and completion rates (perhaps the amount of clicking); students are probably interested in ease of use and the quality of the material. This last group is probably the hardest to understand in terms of goals – it's a broad group spanning many lands.


Frontiers of Open Data Science Research – Stephen Howe, Director of Technical Product Management-Analytics, McGraw-Hill Education.

Outcomes: Learn how data science is being applied to gain new insights about learner and instructor behavior.

The next generation of education technology will use open data to provide a foundation for learning analytics and adaptive learning. The new frameworks will give continuous, high-quality, personalized feedback to help align curricula and assessments and help students make course corrections in their learning. Using open data, educators can provide measurement and reports that will affect learning outcomes. There are three areas that can be explored:

  1. Descriptive – baseline requirements
  2. Predictive – attention lost? off course? doesn't understand?
  3. Prescriptive – how to adjust – adaptive learning (a toy sketch follows this list)
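
One way I picture the three layers stacking (my own toy sketch with invented thresholds and module names, not Howe's): description establishes the baseline, prediction flags the drift, and prescription picks the adjustment.

```python
# Toy analytics ladder: describe -> predict -> prescribe.
# Thresholds and module IDs are invented for illustration.

def describe(scores: list[float]) -> float:
    """Descriptive: the learner's baseline mastery so far."""
    return sum(scores) / len(scores)

def predict_off_course(baseline: float, recent: float, slack: float = 10.0) -> bool:
    """Predictive: has recent performance fallen well below the baseline?"""
    return recent < baseline - slack

def prescribe(off_course: bool) -> str:
    """Prescriptive: adjust the learning pathway when the prediction fires."""
    return "review-module-3" if off_course else "advance-to-module-4"

baseline = describe([78.0, 82.0, 80.0])
print(prescribe(predict_off_course(baseline, recent=64.0)))  # review-module-3
```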

Using adaptive learning environments and providing real-time feedback gives students a pathway between what is known and what should be learned next. But how can we unlock the power of adaptive products that are currently locked into proprietary software? Howe stresses the need for open architectures and standardized data output.

  • IMS Standards
  • LTI – interoperability for integration and identity
  • QTI – assessment interoperability
  • Caliper – a new standard for common data exchange (JSON); a hypothetical event payload follows this list
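
To give a flavor of the common data exchange, here is a hypothetical Caliper-style event serialized from Python. The shape (actor, action, object, eventTime) follows the general outline of the spec, but treat the exact field names and context URL as my approximations, not canonical Caliper:

```python
import json
from datetime import datetime, timezone

# A Caliper-style learning event as JSON. Field names are approximate;
# consult the IMS Caliper spec for the canonical schema.
event = {
    "@context": "http://purl.imsglobal.org/ctx/caliper/v1/Context",  # assumed URL
    "type": "NavigationEvent",
    "actor": {"id": "https://example.edu/users/554433", "type": "Person"},
    "action": "NavigatedTo",
    "object": {"id": "https://example.edu/courses/101/pages/2", "type": "WebPage"},
    "eventTime": datetime.now(timezone.utc).isoformat(),
}

print(json.dumps(event, indent=2))  # the common language products exchange
```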

Howe shared a graphic of three main areas. The first area is the data source of learning events, which is converted into a common data language. From that common language, input APIs feed a learning analytics storehouse where the data is sorted. From that storehouse, output APIs publish to products (phones, computers, dashboards, etc.). Howe claims you must start with the product that is trying to answer a question (the goal outcome), then backwards-architect how you sort the data.
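
Here is a toy end-to-end version of that graphic, working backwards from a dashboard question ("who is off course?") to the event store; every name and schema below is hypothetical:

```python
# Toy version of Howe's three stages:
# learning events -> common data language -> storehouse -> output API -> product.

raw_events = [
    {"who": "s01", "verb": "submitted", "score": 45},
    {"who": "s02", "verb": "viewed"},            # no score attached
]

def to_common_language(raw: dict) -> dict:
    """Input stage: normalize source events into one shared schema."""
    return {"actor": raw["who"], "action": raw["verb"], "score": raw.get("score")}

storehouse = [to_common_language(e) for e in raw_events]   # the sorted store

def off_course_report(store: list[dict], threshold: int = 50) -> list[str]:
    """Output API: answer the product's question from the storehouse."""
    return [e["actor"] for e in store
            if e["score"] is not None and e["score"] < threshold]

print(off_course_report(storehouse))   # dashboard payload: ['s01']
```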

Takeaway: The crux of the biscuit is always the open and standardized data source. That's hard enough to achieve across a single institution, let alone across many. I don't believe we've done enough here at Yale to leverage product APIs and LTIs in our LMS, but I know it's in our sights and on our roadmaps. The future frontier looks bright.

Overall, I believe the conference themes that resonated throughout were learning analytics, hands-on learning assignments that give students the opportunity to fail and try again, and competency-based learning objectives. And did I mention it was really warm and sunny there?

Published in Conferences