Research-Based Evidence for Interactive Engineering Homework: Key Findings from ASEE 2025

A seven-semester study presented at ASEE 2025 provides new data on interactive homework effectiveness in engineering courses. Dr. Yasaman Adibi, Dr. Adrian Rodriguez, and colleagues tracked 345 students under otherwise identical course conditions, varying only whether interactive textbook activities were assigned for points or merely recommended without a grade incentive.

Key findings:

  • The share of students earning a D or F on the final exam fell from 55% to 43% when interactive activities were assigned
  • Student engagement with course materials increased fivefold
  • ANOVA tests showed a statistically significant improvement in final exam performance when interactive activities were assigned

The research addresses a persistent challenge in engineering education: improving student success rates while maintaining academic rigor within realistic resource constraints.

Research Design and Methodological Rigor

Controlled Variables Across Seven Semesters

The study’s strength lies in its experimental controls. Across all seven semesters from Fall 2021 to Fall 2024, the same instructor taught every section, using identical final exam content and grading rubrics throughout the study period. Course structure remained unchanged, with homework consistently weighted at 15% of the total grade across 16 assignments per semester. Students used the same zyBook, "Basic Engineering Circuit Analysis" by Irwin and Nelms, ensuring consistent content coverage and problem types.

The primary experimental variable was straightforward: whether zyBook interactive activities were assigned for points as part of the homework grade or only recommended as supplemental resources. This single-variable approach, combined with the extensive controls, allows for confident attribution of observed differences to the homework implementation method rather than confounding factors.

Statistical Analysis Approach

Adibi’s team conducted multiple ANOVA tests to address potential confounding factors:

  • Within-group analysis: No significant differences between semesters within each group (p > 0.05)
  • Lab session control: Course lab structure changed from 14 to 10 sessions mid-study; ANOVA showed no significant impact on outcomes
  • Between-group comparison: Statistically significant difference in final exam performance (p = 0.022)
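The between-group comparison above is a standard one-way ANOVA. As a minimal sketch of how such a test works, the snippet below computes the F statistic (between-group variance over within-group variance) in pure Python; the score lists are hypothetical placeholders, not the study's data.

```python
def one_way_anova_f(*groups):
    """One-way ANOVA F statistic: ratio of between-group variance to
    within-group variance. A larger F means stronger evidence that the
    group means differ."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    k, n = len(groups), len(all_vals)
    means = [sum(g) / len(g) for g in groups]
    # Between-group sum of squares (weighted by group size)
    ssb = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    # Within-group sum of squares
    ssw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical final-exam scores (percent) -- NOT the study's actual data.
not_assigned_scores = [48, 55, 62, 40, 70, 51, 58, 45, 66, 53]
assigned_scores = [60, 72, 65, 58, 80, 69, 74, 61, 77, 68]
print(f"F = {one_way_anova_f(not_assigned_scores, assigned_scores):.2f}")
```

In practice the F statistic is then compared against the F distribution to obtain a p-value (the study reports p = 0.022 for the between-group comparison), typically via a statistics package rather than by hand.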

This level of methodological rigor distinguishes the study from typical educational technology evaluations that lack proper controls.

Sample Characteristics

Group 1 (Not assigned, 5 semesters): 243 students

  • zyBook activities recommended but not graded
  • 100% traditional paper homework for points
  • Average zyBook engagement: 37-83 minutes per semester

Group 2 (Assigned, 2 semesters): 102 students

  • 50% zyBook activities, 50% paper homework (by points)
  • Total homework workload kept equivalent through careful problem selection
  • Average zyBook engagement: 385-410 minutes per semester

What the Data Shows

Grade Distribution Changes

  Grade   Group 1 (Not Assigned)   Group 2 (Assigned)   Change
  A       10.7%                    14.7%                +37%
  B       19.8%                    24.5%                +24%
  C       14.4%                    17.7%                +23%
  D       21.0%                    17.7%                -16%
  F       34.2%                    25.5%                -25%

Summary impact:

  • Students receiving C or better: 45% → 57%
  • Students receiving D or F: 55% → 43%
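The "Change" column and the summary figures can be reproduced directly from the two grade distributions; the Change values are relative changes in each grade's share. A quick check:

```python
# Grade shares (percent of students) from the two groups in the study.
not_assigned = {"A": 10.7, "B": 19.8, "C": 14.4, "D": 21.0, "F": 34.2}
assigned     = {"A": 14.7, "B": 24.5, "C": 17.7, "D": 17.7, "F": 25.5}

# Relative change in each grade's share, e.g. A: (14.7 - 10.7) / 10.7 = +37%.
for grade in not_assigned:
    rel = (assigned[grade] - not_assigned[grade]) / not_assigned[grade]
    print(f"{grade}: {rel:+.0%}")

# Summary rows: share of students at C or better in each group.
passing_before = sum(not_assigned[g] for g in "ABC")   # 44.9 -> ~45%
passing_after = sum(assigned[g] for g in "ABC")        # 56.9 -> ~57%
print(f"C or better: {passing_before:.0f}% -> {passing_after:.0f}%")
```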

Engagement Metrics

The most dramatic change was in time-on-task with course materials:

  • Without grade incentive: 37-83 minutes total per semester
  • With grade incentive: 385-410 minutes total per semester
  • Engagement increase: 5-fold average

This data confirms previous research showing that assignment structure significantly impacts student interaction with educational resources.

Pedagogical Mechanisms

How Interactive Activities Function

The zyBook platform implements three distinct types of learning activities, each designed to address different aspects of concept mastery. Animations provide step-by-step breakdowns of complex concepts, with completion tracking that ensures students engage with each component before moving forward. These visual explanations focus on building conceptual understanding before students attempt problem-solving tasks.

Learning questions use multiple choice, short answer, and matching formats to reinforce key concepts from each textbook section. Students receive instant feedback on their responses, with detailed explanations for both correct and incorrect answers. The system allows unlimited attempts, encouraging students to persist until they achieve understanding rather than simply moving on after an incorrect response.

Progression level activities represent the most sophisticated component of the interactive system. These problems mirror end-of-chapter textbook exercises but include multiple progressive difficulty levels. Students must demonstrate mastery at the foundational level before accessing intermediate problems, and complete intermediate work before attempting exam-level challenges. Each problem uses randomized values and, in some cases, varying circuit configurations, ensuring that students must understand underlying principles rather than memorize specific solutions.

Academic Integrity Through Design

The randomization approach addresses academic integrity concerns while actually enhancing learning outcomes. Problem values change with each attempt, and circuit configurations vary between students, making traditional copying strategies ineffective. When students submit incorrect answers, they receive complete worked solutions showing the methodology for that specific problem configuration. However, their next attempt presents different values, requiring them to apply the demonstrated approach to a new situation.

This design philosophy acknowledges that students will seek problem solutions regardless of course policies. Rather than fighting this tendency, the system channels it productively by providing high-quality explanations integrated with immediate practice opportunities. Students can’t simply copy answers because the values change, but they can learn from detailed solution explanations and apply those methods to demonstrate understanding. This approach transforms solution-seeking from an academic integrity violation into a learning opportunity.
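The randomization-plus-feedback loop described above can be sketched in a few lines. This is an illustrative toy (a voltage-divider problem with made-up value pools and a 1% grading tolerance), not zyBooks' actual implementation: each call generates fresh parameters, and a wrong answer returns a worked solution for that specific configuration only.

```python
import random

def generate_voltage_divider():
    """Generate a randomized voltage-divider problem. Value pools are
    hypothetical; the point is that every attempt gets new parameters."""
    vs = random.choice([5, 9, 12, 24])           # source voltage (V)
    r1 = random.choice([100, 220, 470, 1000])    # ohms
    r2 = random.choice([100, 220, 470, 1000])
    answer = vs * r2 / (r1 + r2)                 # voltage across R2
    prompt = f"Vs={vs} V, R1={r1} ohm, R2={r2} ohm: find the voltage across R2."
    return prompt, answer

def grade(student_value, correct, tol=0.01):
    """Accept answers within 1% relative tolerance. On a miss, return a
    worked solution for THIS configuration; the next attempt will use
    different values, so copying the number is useless."""
    if abs(student_value - correct) <= tol * abs(correct):
        return True, "Correct."
    return False, f"Incorrect. Worked solution: V = Vs*R2/(R1+R2) = {correct:.3g} V"

prompt, answer = generate_voltage_divider()
print(prompt)
print(grade(answer, answer)[1])
```

The design choice to reveal a full worked solution after a miss is safe precisely because the parameters change on the next attempt, which is what converts solution-seeking into practice.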

Faculty Concerns and Implementation Realities

Q&A Session Insights

During the ASEE presentation, three faculty questions revealed common implementation concerns:

Concern 1: Loss of Individualized Feedback

Question: “Are we removing opportunities for students to evaluate others’ work and provide meaningful feedback?”

Rodriguez’s response: Acknowledged the ideal of individualized feedback while addressing resource realities. Most departments lack sufficient TA support for detailed feedback on every assignment. The interactive system provides immediate, consistent feedback that students actually use during the learning process, rather than delayed feedback that often goes unexamined.

Concern 2: Authenticity of Learning Experience

Question: “Do multiple attempts on a computer actually provide less rigorous practice than single-attempt paper homework?”

Key insight: The randomization component addresses this concern. When students must apply solution methods to new values after seeing worked examples, they demonstrate conceptual understanding rather than pattern matching or memorization.

Concern 3: Practical Implementation

Question: “How does the progression system actually work in practice?”

Implementation details:

  • Students cannot advance to higher difficulty levels without mastering prerequisites
  • Instructors receive data on student progress through each level
  • This information helps identify topics requiring additional lecture emphasis
  • Integration with learning management systems provides gradebook automation
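The gating rule in the first bullet (no access to a level until all lower levels are mastered) amounts to a small state machine per student. A minimal sketch, with level names and the tracker class as illustrative assumptions rather than the platform's API:

```python
# Difficulty levels in ascending order, per the progression described above.
LEVELS = ["foundational", "intermediate", "exam-level"]

class ProgressionTracker:
    """Tracks which levels a student has mastered and enforces that
    each level's prerequisites are complete before it can be attempted."""

    def __init__(self):
        self.mastered = set()

    def can_attempt(self, level):
        idx = LEVELS.index(level)
        # A level is unlocked only when every lower level is mastered.
        return all(lvl in self.mastered for lvl in LEVELS[:idx])

    def record_mastery(self, level):
        if not self.can_attempt(level):
            raise ValueError(f"prerequisites for {level!r} not met")
        self.mastered.add(level)

student = ProgressionTracker()
print(student.can_attempt("intermediate"))   # locked at first
student.record_mastery("foundational")
print(student.can_attempt("intermediate"))   # unlocked after mastery
```

The same per-level state is what gives instructors the progress data mentioned above: aggregating `mastered` sets across students shows where a cohort is stalling.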

Resource Allocation Considerations

The traditional homework approach requires significant instructor and TA time investment for grading, often resulting in feedback delays that reduce learning effectiveness. Students typically receive graded work days or weeks after submission, when the learning moment has passed and motivation to review mistakes has diminished. Additionally, manual grading provides limited insight into specific areas where students struggle, making it difficult to adjust instruction accordingly.

The use of interactive homework shifts resource allocation from grading time to instructional design and data analysis. Interactive components provide automated grading with immediate feedback, while instructors can focus on interpreting engagement data to identify topics requiring additional emphasis. This reallocation doesn’t reduce instructor involvement but redirects it toward more impactful activities: designing effective learning sequences, analyzing student progress patterns, and providing targeted interventions based on real-time data about student difficulties.

Implementation Framework

When This Approach Makes Sense

This interactive homework approach appears most effective in specific institutional and course contexts. Large enrollment courses where individualized feedback becomes impractical represent ideal candidates, particularly foundational courses where concept mastery proves critical for success in advanced coursework. Departments facing resource constraints that limit grading support may find the automated feedback components especially valuable.

The approach works best in courses with well-defined problem-solving methodologies, where step-by-step solution processes can be clearly articulated and systematically practiced. However, several limitations should be considered. The study’s single-instructor design limits generalizability, and results may not transfer directly to different course formats or teaching styles. Implementation requires careful attention to homework workload balancing to avoid simply adding requirements to existing assignments.

Practical Considerations

Successful implementation demands thoughtful integration rather than simple technology adoption. Total homework time must remain equivalent between traditional and hybrid approaches, requiring careful analysis of existing assignments to identify redundancies and optimize learning sequences. Adibi’s team removed paper problems that closely resembled interactive activities, ensuring students weren’t completing essentially identical work in different formats.

Broader Implications for Engineering Education

Evidence-Based Educational Technology Adoption

This study provides a methodological template for evaluating educational technology claims:

  • Statistical rigor, including confounding variable analysis
  • Practical outcome measures relevant to student success
  • Honest assessment of limitations and implementation requirements

Research Questions for Further Investigation

Scalability studies needed:

  • Multi-instructor implementations
  • Cross-institutional replication
  • Different engineering course types
  • Varying student population characteristics

Mechanism studies:

  • Which interactive features contribute most to improved outcomes?
  • Optimal balance between interactive and traditional homework
  • Long-term retention of concepts learned through different approaches

Conclusions and Next Steps

The study demonstrates that interactive homework can improve student outcomes when implemented with attention to pedagogical principles and proper incentive structures. The research quality—including rigorous controls and statistical analysis—provides a foundation for evidence-based decision making about educational technology adoption.

Key takeaways for faculty:

  • Interactive homework provides students with immediate feedback and multiple attempts
  • Grade incentives significantly impact student engagement with educational resources
  • Proper research design is essential for evaluating educational interventions

Questions for further consideration:

  • How might these findings apply to your specific course context?
  • What resource constraints affect educational technology decisions in your department?
  • What additional research would inform adoption decisions at your institution?

The ASEE presentation and subsequent discussion highlighted both the potential and the complexity of implementing research-based changes in engineering education. As Rodriguez noted, the goal isn’t to replace traditional teaching methods wholesale, but to optimize learning systems within realistic constraints while maintaining academic rigor.


This research was conducted using zyBooks interactive textbook platform. The study findings were presented at the 2025 ASEE Annual Conference & Exposition in Montreal.