At a Glance

So, your students are using AI tools like ChatGPT. Now what?

This guide offers a practical framework for designing AI-resilient learning experiences. We’ll walk through four immediate steps that you can take to support your students’ learning in the age of AI.

These are the assumptions behind our approach:

  1. Some of your students are using generative AI for coursework. In a Fall 2023 survey at MIT Sloan, 82% of student respondents said they had used generative AI for coursework that term. Other research suggests that students may use these tools regardless of your policies (Shaw et al., 2023).
  2. It’s almost impossible to police students’ use of generative AI. Since AI detectors don’t work, the only way to enforce a generative AI ban is to make students complete deliverables in person without devices. That approach can disadvantage otherwise high-performing students who struggle to write by hand or under tight time constraints.
  3. Generative AI use can help or hurt students’ learning. While some students use generative AI to avoid learning, others use those same tools to support their learning. MIT Sloan students report using generative AI to facilitate brainstorming, create visual aids, and translate course content.
  4. Students need to be able to recognize when AI gets it wrong. That means they need to build domain knowledge. It also means they’ll need to develop an understanding of how generative AI tools work, where they might trip up, and why they might not be useful for solving certain kinds of problems.
  5. You can design AI-resilient learning experiences. By thoughtfully planning learning experiences that take modern technologies into account, you can maximize your students’ learning in our generative AI-enabled world.

Tip: This guide centers around how you can maximize your students’ learning when they’re using generative AI. If you want to learn how you can use generative AI to augment your teaching, see Getting Started with AI-Enhanced Teaching: A Practical Guide for Instructors.

Our Design Framework

Our approach to AI-resilient learning design is grounded in a popular education framework called Understanding by Design. This framework is also commonly known as “backward design” because you start by thinking about what you want learners to achieve by the end of the learning experience.

We’ve augmented the traditional Understanding by Design framework to include these four steps, each of which takes generative AI into account.

Infographic showing the four steps to AI-resilient learning experience design

  1. Learners: Review your students’ backgrounds, goals, and likely interactions with generative AI.
  2. Learning Outcomes: Identify what students should know, understand, or be able to do by the end of this experience given AI’s capabilities.
  3. Assessments: Decide how students will demonstrate that they’ve achieved the learning outcomes given AI’s capabilities.
  4. Activities: Plan activities and resources that will help students build knowledge and skills given AI’s capabilities.
Download the 4 Steps PDF

During each step, you’ll probably think back and revise your previous plans. You can also think ahead to the next step. The most important thing is to start with the end in mind so you can help your students maximize their learning throughout the experience.

Tip: Before you start designing your AI-resilient learning experience, spend some time experimenting with generative AI tools. Think about how your students could use each tool to facilitate, augment, or bypass learning opportunities.

The AI-Resilient Learning Experience Design Toolkit

You can use the AI-Resilient Learning Experience Design Toolkit to help you apply the guidance in this article to your own course.

Download the Toolkit

If this is your first time exploring how AI could impact your course design, you might want to start small. For example, you can select just one class session or module to redesign. For a broader approach, you could think about redesigning an entire course.

Here are two examples of completed AI-Resilient LX Design Toolkits:

4 Steps to Design an AI-Resilient Learning Experience

Ready to adapt your teaching for the AI era? The first step in AI-resilient learning experience design is to gain a thorough understanding of your learners, so let’s start there.

1. Learners

Start by thinking about your students and their context. Building empathy for your learners will help you craft a learning experience that’s truly tailored to their needs.

1.1 Review Your Students’ Context

  • What program are they enrolled in?
  • Are there any prerequisites for this learning experience?
  • What prior knowledge and skills do they have?
  • What are their career goals?

1.2 Think About Students’ Interactions with Generative AI

  • What’s their prior experience with generative AI? Note that in a Fall 2023 survey, 81% of MIT Sloan student respondents said they used generative AI at least once a week.
  • What access to generative AI will they have during the learning experience? Students might use free or paid versions of various generative AI tools.
  • How might they use generative AI in their future professional roles?

Your learners’ context will provide an essential starting point as you think about outcomes, assessments, and activities for your AI-resilient learning experience.

Activity: Record your reflections on your learners and their context in Part 1 of the AI-Resilient Learning Experience Design Toolkit. As you write your response, consider referring to Toolkit Example 1: Organizational Leadership (PDF) and Toolkit Example 2: Data Analytics (PDF).

2. Learning Outcomes

Now that you know where your students are coming from, it’s time to define the destination: the learning outcomes.

Learning outcomes describe the intended results of the learning experience. They’re the “why” behind your assessments and activities.

In this step, you’ll write learning outcomes that account for your students’ backgrounds and generative AI’s impact.

Tip: The terms “learning goals” and “learning outcomes” are sometimes used interchangeably. In this guide, we’re sticking to the term “learning outcome” and using this definition for it: Learning outcomes are statements that describe what learners should know, understand, and be able to demonstrate after they’ve completed the learning experience.

2.1 Draft Learning Outcomes

Start by responding to this prompt: By the time your students complete this learning experience…

  • What should they be able to do?
  • What should they know and understand?
  • What should their opinions be about these topics?

If it’s helpful, you can also frame this question in terms of time: What should students remember from this experience in six months, or one year, or five years?

When you’re revising outcomes for an existing course, it can help to think about an assessment that you already have that works really well. Why does it work well? What do students demonstrate when they complete that assessment? (J. Rankin, personal communication, February 16, 2024).

Tip: When designing an entire course, best practice is to define 5-10 total learning outcomes (McCourt, 2007) with broader scopes. For a single unit (or one week, lesson, or session), aim for just 1-3 learning outcomes with narrower scopes. Defining the right number of outcomes can help keep the experience manageable for you and your students.

2.2 Consider AI’s Impact on Learning Outcomes

Now, it’s time to make sure you’re creating an AI-resilient learning experience. Are there topics that you’d like to 1) retire or 2) promote given AI’s capabilities?

  • Retire: Just like fluency with a slide rule became less important after the invention of the pocket calculator, you might want to rethink some topics given generative AI’s capabilities. Consider whether generative AI alone could “achieve” any of your existing learning outcomes (J. Rankin, personal communication, January 11, 2024).
  • Promote: Some areas might be newly important in a generative AI-enabled age. For example, would you like students to be able to identify common misconceptions that could surface in AI hallucinations? Or be able to evaluate whether subject-related tasks fall within or outside generative AI’s current capabilities?

MIT Sloan students have told us that they appreciate transparency from faculty around how AI tools might impact their learning. You’ll be better able to guide conversations around AI-resilient learning if you start by getting a clear handle on if, when, how, and why AI could affect your students’ experiences.

2.3 Refine and Rewrite Learning Outcomes

Review your brainstorming. Then write your revised learning outcomes by completing this prompt: “By the time students finish the learning experience, they should be able to [ACTIVE VERB]…”

MIT’s Teaching and Learning Lab blog explains why this format is useful:

(1) by using active verbs, we articulate actions that students will be able to do. These actions can be observed and compared with our desired results;

(2) it puts the focus on students (note: it is not “By the end of the course the instructor will have…”), and advances the ultimate goal of backward design – to facilitate learning, not simply to “cover” content.

Here’s an example: “By the time students finish this course, they should be able to design a digital marketing campaign and analyze its effectiveness through key performance indicators.”

When selecting an active verb for your learning outcome, we recommend using a verb that aligns with one of the levels of Bloom’s Taxonomy.

Bloom’s Taxonomy is a framework that can help you craft effective learning outcomes, align them with assessments, and guide students in working their way from lower-level cognitive skills (e.g., remembering) to higher-level skills (e.g., creating).

This Bloom’s Taxonomy infographic from the Vanderbilt University Center for Teaching shows the levels of Bloom’s Taxonomy and examples of active verbs that align with each level (Armstrong, 2010).

Bloom's Taxonomy infographic with levels and verbs

Source: Bloom’s Taxonomy [graphic], by Center for Teaching Vanderbilt University, 2016, Flickr (flickr.com/photos/vandycft/29428436431). CC BY 2.0 DEED.

Bloom’s Taxonomy Level  Definition  Example Verbs 
Create  Produce new or original work  create, generate, plan, produce, design 
Evaluate  Justify a stand or decision  evaluate, check, critique, assess 
Analyze  Draw connections among ideas  analyze, differentiate, organize, attribute 
Apply  Use information in new situations  apply, execute, implement 
Understand  Explain ideas or concepts  interpret, classify, summarize, infer, compare, explain 
Remember  Recall facts and basic concepts  recall, recognize, identify 

Would you like your students to be able to “justify a stand or decision”? Consider using a verb like appraise, argue, defend, judge, etc. Alternatively, if you’d like to focus on students’ ability to “use information in new situations,” you could use a verb like execute, implement, solve, use, etc.

Given the context of generative AI, you may want to prioritize verbs from the Apply, Analyze, Evaluate, and Create levels of Bloom’s Taxonomy. These verbs align with higher-order thinking skills that could be increasingly essential to students in an AI-enabled world (Genone & Hughes, 2023).

As described in the Teaching & Learning Lab blog, your final learning outcomes should be:

  • Specific: Craft outcomes that plainly state what learners should achieve. Use active verbs that refer to tangible skills and knowledge. Avoid vague language that does not give a precise sense of expected abilities and competencies.
  • Measurable: There should be tangible evidence that students have achieved the outcomes, whether through rubrics, evaluations, or other methods. Outcomes should focus on demonstrable skills and knowledge that can be meaningfully assessed after the learning experience.
  • Realistic: Consider the timeframe, scope, and resources for the learning experience. Make sure you set challenging but feasible outcomes that students can reasonably accomplish given constraints. Align the outcomes with course level, student background, and other contextual factors.

The goal is to craft outcomes that give you and your students clear, focused targets to work towards. The outcomes signal what subject mastery looks like for this learning experience. Well-defined outcomes facilitate AI-resilient learning activity design and help you plan effective assessments of student progress.

2.4 Example Learning Outcomes

These are some example course-level learning outcomes:

  • Develop a business plan for a start-up, incorporating market analysis, financial forecasting, and strategic planning.
  • Conduct a detailed case study analysis of a multinational corporation, focusing on its operational, financial, and ethical challenges.
  • Apply project management methodologies to manage complex business projects, ensuring timely delivery and quality standards.
  • Communicate business strategies effectively, crafting persuasive presentations and reports for diverse stakeholders.

These are some example session- or module-level learning outcomes:

  • Interpret the key elements of a balance sheet and income statement in a case study.
  • Create a financial forecast for a hypothetical business scenario using Excel or a similar tool.
  • Design a project plan for a small business initiative, applying basic principles of project management.
  • Discuss the implications of cultural differences in international business.
  • Prepare and deliver a persuasive business presentation using digital tools.

Additional Resources: Learning Outcomes

  • MIT Teaching + Learning Lab – Where to Start: Backward Design – This resource from MIT’s Teaching + Learning Lab guides you through crafting specific, measurable, and student-centered learning outcomes. Understand how to align assessments and activities with these outcomes for a more intentional and effective learning experience.
  • Oregon State University Ecampus: Bloom’s Taxonomy Revisited – This resource revisits Bloom’s Taxonomy to help you align course activities and assessments with generative AI capabilities. It emphasizes the need to review and amend learning outcomes, especially at the Remember and Analyze levels, to foster distinctive human skills in an AI-enabled world.

Activity: Go to Part 2 of the AI-Resilient Learning Experience Design Toolkit and write your learning outcomes. As you write your response, consider referring to Toolkit Example 1: Organizational Leadership (PDF) and Toolkit Example 2: Data Analytics (PDF).

3. Assessments

Now, follow these steps to decide what students will produce to show they’ve achieved the learning outcomes.

3.1 Choose the Learning Outcome(s)

Review your list of learning outcomes. Choose the learning outcome(s) you’d like to focus on for this assessment.

3.2 Determine the Assessment Task(s)

Complete this prompt: “Students will show that they’ve achieved the learning outcome(s) by…”

Consider these guidelines:

  • Use varied assessment methods. Don’t rely on a single type of assessment throughout a course. Include both formative and summative assessments. Mix traditional methods like quizzes with presentations, projects, case studies, and other formats.
  • Incorporate real-world tasks. Include assessments that mimic real-life challenges. How can students apply their skills and knowledge in a way that mirrors tasks they might complete in a professional context?
  • Align each assessment with at least one learning outcome. The key to alignment is matching the assessment method to the skill, complexity, and cognitive process that you already selected when crafting your learning outcome. For example, if you want students to be able to make connections between real situations and the 3 Lenses Framework, you can’t just ask them to explain the framework. Instead, you might ask them to use that framework as the basis for analyzing a case study.

Here’s an overview of possible assessment types aligned with Bloom’s Taxonomy active verbs, based on the University of Louisville resource Use Bloom’s Taxonomy to Align Assessments:

Bloom’s Taxonomy Level  Example Verbs  Example Assessments 
Create: Produce new or original work.  create, generate, plan, produce, design  Research projects, compositions, essays, business plans, website designs requiring making, building, designing something new 
Evaluate: Justify a stand or decision.  evaluate, check, critique, assess  Journals, product reviews, studies requiring judging against criteria 
Analyze: Draw connections among ideas.  analyze, differentiate, organize, attribute  Case studies, critiques, papers, debates, concept maps requiring differentiation, determining relevance, function, bias 
Apply: Use information in new situations.  apply, execute, implement  Problem sets, performances, labs, simulations requiring using procedures for familiar & unfamiliar tasks 
Understand: Explain ideas or concepts.  interpret, classify, summarize, infer, compare, explain  Papers, exams, discussions, concept maps requiring summarization, classification, comparison, paraphrasing, finding examples 
Remember: Recall facts and basic concepts.  recall, recognize, identify  Objective questions (fill-in-blank, matching, labeling, multiple choice) requiring students to recall or recognize facts, terms, concepts 

3.3 Determine Acceptable AI Use

Think about how your assessment will account for generative AI. At the most basic level, for any given assessment, you can require, allow, or ban generative AI use. Consider both your students’ context and the learning outcomes you’ve developed as you decide which is the best approach for the assessment.

Tip: No matter what your approach, transparency is key. Be clear with your students about when and why they’re allowed or not allowed to use generative AI during assessments. Whenever possible, provide examples of appropriate and inappropriate generative AI use.

Option 1: Require Generative AI

This means you’ll require students to use generative AI while they demonstrate that they’ve achieved the learning outcomes. Consider this approach when AI use will enable new and effective assessment opportunities.

Option 2: Allow Generative AI

This means allowing (but not requiring) students to use generative AI. For example, if your students in an entrepreneurship course are learning about pitching different kinds of products, you might allow them to use generative AI to brainstorm products and create images to include in their pitch materials.

Tip: Requiring or allowing generative AI for assessments could be especially beneficial if, in related professional contexts, learners will be expected to leverage generative AI.

Option 3: Don’t Allow Generative AI

This means telling students they should not use generative AI for the assessment. Here are some contexts where this might be the best approach:

  • Foundational knowledge: For assessments focused on foundational knowledge within a field, it could be difficult to tell whether a correct response can be attributed to the student’s knowledge versus a lucky “roll of the dice” with a generative AI tool.
  • Real-world constraints: For assessments meant to simulate real-world constraints, the target professional settings might restrict generative AI use due to practical barriers (like time, cost, or data privacy). In those cases, mirroring those parameters could promote authentic skill-building.

Since AI detectors don’t work, you can only enforce an AI ban by asking students to complete assessments in the classroom without a laptop. However, that approach can disadvantage students with disabilities and other marginalized learners—as well as any student who struggles to write by hand or under tight time constraints.

Before facilitating an in-person, no-devices assessment, make sure to review MIT Disability and Access Services’ resource Faculty Procedures for Providing Academic Accommodations. Also, make sure your students have plenty of time and opportunity to request accommodations for these assessments.

Additional Resources: Assessments

  • MIT Teaching + Learning Lab – Assess for Learning: This resource from MIT’s Teaching and Learning Lab offers guidance on designing assessments that not only measure learning but also contribute to it. It outlines strategies for aligning assessments with learning outcomes. It also highlights the importance of formative assessments in providing feedback to both students and instructors.
  • Vanderbilt University Center for Teaching – Student Assessment in Teaching and Learning: This comprehensive guide covers various aspects of student assessment in higher education. It discusses different methods, the purpose of assessments, and the importance of aligning them with learning objectives. It’s a valuable resource for understanding the multifaceted nature of assessment beyond just grading.

Activity: Go to Part 3 of the AI-Resilient Learning Experience Design Toolkit to write your plan for assessments. As you write your response, consider referring to Toolkit Example 1: Organizational Leadership (PDF) and Toolkit Example 2: Data Analytics (PDF).

4. Activities

How will you make the best possible use of in-person class time during your AI-resilient learning experience? Consider “flipping” your class.

In a flipped classroom, students build core knowledge outside the classroom through targeted homework assignments. Then in-person class time centers around active learning experiences. On the Vanderbilt University Center for Teaching’s page Active Learning, Cynthia Brame (2016) describes flipped learning in the context of Bloom’s Taxonomy:

This means that students are doing the lower levels of cognitive work (gaining knowledge and comprehension) outside of class, and focusing on the higher forms of cognitive work (application, analysis, synthesis, and/or evaluation) in class, where they have the support of their peers and instructor.

Studies have shown that flipping the classroom and implementing active learning can result in significant learning gains for students (Freeman et al., 2014; Jensen et al., 2017; Deslauriers et al., 2011).

A flipped classroom approach may be especially useful in the era of generative AI. Interactive in-person sessions give you the opportunity to support students’ learning regardless of whether and how they’re using generative AI outside the classroom. These sessions are a time to meet your students where they are—and to help them build the professional expertise they’ll need in their future careers.

Tip: The value of a flipped classroom comes from dedicating freed-up class time to interactive and engaging activities that align with your learning goals. Moving lectures outside of class without in-class active learning may not have a positive impact (Jensen et al., 2017).

This section on AI-resilient learning activities will walk through these steps:

  • Choose the Learning Outcome(s): What learning outcome will the prework and activity help students build towards?
  • Design the Prework: What pre-class activities and resources will help students build the foundational knowledge to participate in in-class activities? Consider allowing or encouraging AI-use for outside-of-class prework.
  • Design the Activities: What in-class activities will students take part in to engage with the learning outcome(s)?
  • Determine Acceptable AI Use: Will you require, allow, or ban AI use for the prework and in-class activity?

4.1 Choose the Learning Outcome(s)

Review your list of learning outcomes. Choose the learning outcome you’d like to focus on for this set of learning activities.

4.2 Design the Prework

Decide what pre-class work students will need to do to prepare for the session. Consider readings, videos, case studies, and low-stakes quizzes. Create or curate these resources and activities and post them on your Canvas course site.

A lot of people associate the flipped classroom with lecture videos. However, flipped learning doesn’t need to include any videos. Before class, it might make more sense for students to read a case study, take a low-stakes Canvas quiz, or even have a conversation with an AI chatbot.

Videos can be great—and you’re very welcome to connect with us about creating videos in the Teaching Studio. However, effective prework can take many different forms.

What if students don’t do the prework?

A key challenge in flipped classrooms is that there’s no guarantee students will do the assigned prework. Here are some strategies to tackle the issue:

  • Clear Expectations: Start by making it clear to students why prework is crucial. Explain how it will benefit their learning. Make sure they understand that in-class activities will build directly on the prework.
  • Accountability: Start class with a quiz about the prework. This approach encourages students to do the work and also gives you a quick overview of their understanding. Frequent low-stakes quizzing also gives students the chance to practice retrieval, which supports the learning process (Rankin, 2024).
  • Flexible In-Class Activities: Design class activities to cater to different levels of preparation. You can have tiered activities where students who haven’t done the prework can start at a more basic level, while those who have can engage in more advanced or application-based tasks. This way, you’re not holding back students who are prepared, but you’re also not leaving behind those who aren’t.
  • Student Feedback: Gather feedback from your students regularly (not just at the end of the term). Find out what’s working for them and what isn’t. Maybe there are reasons you can address for why they’re not doing the prework.

The goal is to create a learning environment where students see prework as both important and manageable.

4.3 Design the Activities

In a flipped classroom, the real magic happens live. Since students are already familiar with the topic, you can center class time around active learning that helps students build towards the learning outcomes. In this context, insight emerges as students share meaningful perspectives and collaborate to build on prior knowledge.

Consider the examples below of how prework can lead to in-class active learning.

 Activity Type  Prework  In-Class Activity 
Reading and Discussion  Assign relevant chapters or articles. Provide guiding questions to focus students' reading.  Conduct a student-led discussion about the readings, encouraging students to apply concepts to case studies or current industry examples. 
Case Study Exploration  Choose an MIT Sloan case study or HBP case study that aligns with your learning outcomes. Guide students to identify key strategic decisions as they read the case study before class.  Organize breakout groups for in-depth case analysis, followed by presentations and a full-class discussion to connect case findings with broader course concepts. 
Problem Sets  Assign problem sets that apply course concepts to practical scenarios. Instruct students to prepare justifications for their answers.  Facilitate a collaborative class review of the problem sets. Have students work in pairs or small groups to discuss and compare their solutions, then share insights with the class. 
Simulations  Choose an MIT Sloan simulation or an HBP simulation. Assign students to review related materials before class.  Conduct the interactive simulation in class, followed by a guided discussion to analyze decisions and strategies and link them to real-world implications. 
Strategy Discussions  Assign students to develop strategies related to course content, encouraging research and critical analysis.  Facilitate a workshop-style discussion where students present their strategies and receive peer and instructor feedback. 
Debates  Assign roles or perspectives based on course material or case studies. Guide students in researching their assigned positions.  Conduct structured debates in class. Have each student argue their perspective and then facilitate a reflective conversation. 
Guest Speaker Sessions  Provide students with information about the speaker and encourage them to prepare questions related to course themes.  Invite the guest speaker to interact with students during class. Lead a dynamic Q&A, encouraging students to ask prepared and impromptu questions. Follow up with a discussion segment where students reflect on the speaker's insights in relation to the course content.  

4.4 Additional Resource: Flipped Classrooms and Active Learning

4.5 Determine Acceptable AI Use

As with assessments, you have three basic options for how to account for AI: you can require, allow, or ban AI use.

Tip: No matter what your approach, transparency is key for AI-resilient learning experiences. Be clear with your students about when and why they’re allowed or not allowed to use generative AI during in-class activities. Whenever possible, provide examples of appropriate and inappropriate generative AI use.

Option 1: Require Generative AI

You can require generative AI use if that will open the door to activities that are closely aligned with your learning outcomes. Consider these examples of AI-integrated learning experiences:

Option 2: Allow Generative AI

You can give students the option of using generative AI without requiring its use. This approach may be best if generative AI use won’t facilitate or impede students’ engagement with the learning outcomes. It also helps if students are familiar enough with the topic that they’ll be able to tell if the AI tools are hallucinating.

For example, in a marketing strategy class, you could give students the option of using AI during certain steps of creating a campaign proposal. Some might choose to use AI for generating initial ideas or ad copy, while others could rely on traditional brainstorming. This flexibility would allow students who are more comfortable with AI to explore its potential, while those who prefer traditional methods could apply those strengths. The learning outcomes—which might involve aligning with marketing principles and developing a coherent strategy—remain central, regardless of what tools students use as they work to achieve them.

Tip: Requiring or allowing generative AI use for in-class activities could be especially beneficial if, in professional contexts related to these learning outcomes, learners will be expected to leverage generative AI.

Option 3: Don’t Allow Generative AI

Prework: Since generative AI detectors don’t work, banning AI for prework puts you in the difficult position of having a policy that you can’t enforce (J. Rankin, personal communication, January 11, 2024). If students know that AI use will be banned during the related in-class assessment, that may motivate most of them to complete outside-of-class prework without AI support. However, for graded outside-of-class prework, you may run into equity issues if some students use generative AI regardless of your policies.

In-Class Activities: In-class activities offer more flexibility when it comes to generative AI use. You can prevent students from using generative AI in class if you don’t allow access to devices during activities. Here are some contexts where that might be the best approach:

  • Foundational knowledge: For activities focused on building foundational knowledge within a field, permitting AI tools could undermine that objective. Students developing core competencies might not have the context to critically evaluate AI output.
  • Real-world constraints: If target professional settings restrict certain tools due to practical barriers (like time, costs, or data privacy), mirroring those parameters could promote authentic skill-building.

4.6 Maximizing the Flipped Classroom

To make the most of your class activities and prework, follow these guidelines:

  • Match every activity to specific learning outcomes: Keep your course focused and your students on track by making sure each activity aligns with at least one learning outcome.
  • Weave in formative assessments: Incorporate low-stakes formative assessments throughout the course so you and your students can gauge their progress and target areas that need work.
  • Provide clear instructions: Make sure students understand the purpose and expectations of both prework and in-class activities.
  • Encourage active participation: Design in-class activities that require students to actively engage with the material. This might include discussions, projects, or problem-solving exercises.
  • Integrate technology thoughtfully: Use technology to enhance learning, not just for its own sake. Ensure any technology supports students’ progress towards the learning outcomes.

By flipping your class and emphasizing active learning, you create a dynamic learning environment where students engage with complex topics, use what they’ve learned, and think critically. This approach not only makes your course more interactive and engaging but also gives students the chance to apply new knowledge and skills regardless of AI’s impact.

Activity: Go to Part 4 of the AI-Resilient Learning Experience Design Toolkit to write your plan for activities and resources. As you write your response, consider referring to Toolkit Example 1: Organizational Leadership (PDF) and Toolkit Example 2: Data Analytics (PDF).

What’s the Research?

This approach to AI-resilient learning experience design is based on foundational education research along with recent findings related to generative AI. We’ve tried to provide a robust foundation for you to build on as you guide your students towards a future where AI could play an important role in their work.

Education Research

Our approach is grounded in these research-based instructional design approaches:

  1. Backward Design: This strategy, also known as Understanding by Design (UbD), begins with setting clear, human-centric goals and objectives. By first determining what students need to learn, educators can plan the course content and activities that lead to these outcomes.
  2. Authentic Assessments: Assessing students through assignments that closely relate to real-world situations provides a more accurate measure of their ability to apply what they’ve learned in practical situations. This approach aligns with the work of Grant Wiggins, who emphasizes the value of assessments that require students to perform tasks that demonstrate meaningful application of essential knowledge and skills (1998).
  3. Flipped Classrooms and Active Learning: Maximizing in-person class time for discussions, debates, role-plays, and workshops aligns with research on active learning. Studies like those by Freeman et al. (2014) show that active learning strategies can significantly improve student performance in science, engineering, and mathematics. By flipping the classroom, we prioritize hands-on, interactive learning experiences over traditional lecture-based teaching, fostering a deeper understanding and application of course material.

Generative AI Research

On the generative AI side of things, let’s start with some research that took place within MIT Sloan. In a Fall 2023 survey about students’ use of generative AI (n=118), 97 respondents (82%) said they had used generative AI for coursework that term; 21 (18%) had not.

Pie chart showing student AI use at MIT Sloan in Fall 2023

Outside MIT, an April 2023 Tyton Partners survey of 2,000 students from two- and four-year private and public institutions found that 51% of student respondents said they’d use generative AI for coursework even if it were prohibited by their instructors or institutions (Shaw et al., 2023). It’s reasonable to assume that some MIT Sloan students might feel the same way.

In terms of life after graduation, research suggests that our students may benefit from knowing how to effectively integrate generative AI into their work. In their article “Navigating the Jagged Technological Frontier,” Dell’Acqua et al. (2023) suggest “that the capabilities of AI create a ‘jagged technological frontier’ where some tasks are easily done by AI, while others, though seemingly similar in difficulty level, are outside the current capability of AI” (p. 2). Their study with Boston Consulting Group (BCG) found the following:

  • For tasks considered to be within generative AI’s current capabilities, or “inside the jagged frontier,” AI use improved consultants’ performance across the board.
  • For tasks outside the frontier, consultants using AI “were 19 percentage points less likely to produce correct solutions compared to those without AI” (Dell’Acqua et al., 2023, p. 2).

What’s the catch? No one knows exactly what will fall inside or outside AI’s capabilities at a given time. As MIT Sloan alum and Wharton Associate Professor Ethan Mollick describes, “On some tasks AI is immensely powerful, and on others it fails completely or subtly. And, unless you use AI a lot, you won’t know which is which” (Mollick, 2023).

The Dell’Acqua et al. study (2023) suggested that consultants assigned to outside-the-frontier tasks were most successful when they interrogated AI and exercised expert judgment while leveraging AI for the task. Those who were least successful “tended to blindly adopt its [AI’s] output and interrogate it less.”

Given the productivity gains associated with AI use “inside the jagged frontier,” at least some of our students will probably be expected to use these tools in the future (Dell’Acqua et al., 2023). Effectively navigating that jagged frontier will require our graduates to apply expertise and use professional judgment rather than blindly adopting AI outputs. We can help by preparing students for “engaged augmentation” via generative AI—that is, knowing how to effectively engage with AI and check its outputs if they do incorporate these tools into their professional workflows (Lebovitz et al., 2022).

Conclusion

As new technologies reshape education, the framework in this guide offers a practical approach for designing AI-resilient learning experiences. Regardless of technologies’ impacts, though, some things haven’t changed: We still want students’ learning to help them become creative, ethical leaders ready to synthesize perspectives, make discerning judgments, and direct technology to benefit business and society.

References

Armstrong, P. (2010). Bloom’s taxonomy. Vanderbilt University Center for Teaching. https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy

Bowen, R. (2017). Understanding by design. Vanderbilt University Center for Teaching. https://cft.vanderbilt.edu/understanding-by-design

Brame, C. (2016). Active learning. Vanderbilt University Center for Teaching. https://cft.vanderbilt.edu/active-learning

Center for Innovative Teaching and Learning. (n.d.). Authentic assessment. Indiana University Bloomington. https://citl.indiana.edu/teaching-resources/assessing-student-learning/authentic-assessment/index.html

Center for Innovative Teaching and Learning. (n.d.). Summative and formative assessment. Indiana University Bloomington. https://citl.indiana.edu/teaching-resources/assessing-student-learning/summative-formative/index.html

Delphi Center for Teaching and Learning. (n.d.). Use Bloom’s taxonomy to align assessments. University of Louisville. https://louisville.edu/delphi/resources/-/files/resources/pages/Blooms-Taxonomy-Handout.pdf

Dell’Acqua, F., McFowland, E., Mollica, E., Lifshitz-Assaf, H., Kellogg, K., Rajendran, S., Krayer, L., Candelon, F., & Lakhani, K. (2023). Navigating the jagged technological frontier: Field experimental evidence of the effects of AI on knowledge worker productivity and quality (Harvard Business School Technology & Operations Mgt. Unit Working Paper No. 24-013). https://dx.doi.org/10.2139/ssrn.4573321

DesLauriers, L., Schelew, E., & Wieman, C. (2011). Improved learning in a large-enrollment physics class. Science, 332, 862-864. https://www.science.org/doi/10.1126/science.1201783

Freeman, S., Eddy, S., McDonough, M., Smith, M., Okoroafor, N., Jordt, H., & Wenderoth, M. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23). https://www.pnas.org/doi/full/10.1073/pnas.1319030111

Genone, J., & Hughes, S. (2023). Integrating Artificial Intelligence [White paper]. Minerva Project and the Edmond de Rothschild Bridge for Higher Education and Employment. https://www.minervaproject.com/white-paper/integrating-artificial-intelligence

Jensen, J., Kummer, T., & Godoy, P. (2017). Improvements from a flipped classroom may simply be the fruits of active learning. CBE—Life Sciences Education, 14(1). https://www.lifescied.org/doi/full/10.1187/cbe.14-08-0129

Lebovitz, S., Lifshitz-Assaf, H., & Levina, N. (2022). To engage or not to engage with AI for critical judgments: How professionals deal with opacity when using AI for medical diagnosis. Organization Science, 33(1), 126-148. https://doi.org/10.1287/orsc.2021.1549

Mollick, E. (2023). Centaurs and cyborgs on the jagged frontier. One Useful Thing. https://www.oneusefulthing.org/p/centaurs-and-cyborgs-on-the-jagged

Rankin, J. (2024, January 11). GenAI in course design. [PowerPoint presentation]. MIT Canvas.

Shaw, C., Bharadwaj, R., NeJame, L., Martin, S., Janson, N., & Fox, K. (2023). Time for class 2023: Bridging student and faculty perspectives on digital learning. Tyton Partners. https://tytonpartners.com/time-for-class-2023-bridging-student-and-faculty-perspectives-on-digital-learning

Teaching + Learning Lab. (n.d.). Where to start: Backward design. MIT. https://tll.mit.edu/teaching-resources/course-design/where-to-start-backward-design

Wiggins, G. (1998). Ensuring authentic performance. In Educative assessment: Designing assessments to inform and improve student performance (pp. 21-42). Jossey-Bass.