THE CLIENT

The Innovate for Africa Fellowship was a fellowship program run by Novustack, an organization that built innovation ecosystems across Africa from January 2020 to June 2023. The fellowship identified, trained, and inspired aspiring African entrepreneurs. After a rigorous full-time one-month training, fellows were matched to international startups for a chance at paid internship placements. They were also offered professional development in a chosen career path. The program ran a synchronous Design Thinking module and an asynchronous Innovation Readiness module.

MY ROLE

I joined the team after the pilot in October 2021. I had two roles:

Learning designer, where I assisted in editing and rewriting the four-week Design Thinking course and edited video recordings for the Innovation Readiness course.

UX researcher / monitoring and evaluation officer, informing the redesign of the same course for the next round.

LEARNING DESIGN

Using Google Suite, we redesigned the lesson slides for the innovation curriculum based on feedback from the first round, applying:

Design best practices: ensure branding consistency (design our own images rather than reusing external material with mismatched fonts and colors) and reduce the amount of text per slide.

Learning best practices: ensure flow across the lessons, chunk the workflow, and allow more time for discussion, questions, and reflection.

Example of branding consistency across the 18 lessons, with recognizable slides.

In addition, we edited the script for the facilitators and tested it for timing to make sure we met those goals.

Screenshot of the script with time prompts, links, and instructions for the facilitators.

EVALUATION

In this project, I led a comprehensive evaluation of a learning program with the aim of measuring its effectiveness and informing the redesign of its next iteration. My role focused on identifying strengths and growth areas through data collection, analysis, and strategic recommendations, while ensuring the program remained responsive to the needs of a diverse and evolving learner community. In collaboration with the team, the following objectives were identified:

Objectives

Assess how effectively the program was meeting its stated goals.

Understand participant performance and what influenced it.

Provide actionable insights to guide the next version of the program.

Methods

After completing the logic model for the organization and reviewing the theory of change, four methods were chosen for this evaluation:

A fellow survey

Semi-structured interviews with fellows

Interviews with facilitators

Analysis of fellow outputs.


After the overall analysis of the deliverables was completed, a facilitator went over the findings, leaving additional comments and insights to contextualize them. This partner check ensured the client's involvement throughout the process. Given the amount of data available, we underestimated how labor-intensive this process would be. One of the central findings of this tool became apparent immediately: the lack of a simple system for data collection. It took several days to gather all relevant data in one central location, since the locations and titles of the required tasks made them difficult to find when browsing through fellow portfolios.


Key findings

Strengths identified:

  • Participants reported a strong sense of belonging and satisfaction.

  • Core community values were clearly reflected in participant outputs.

  • Evidence of personal growth and increased self-confidence was consistently present across reflections and submissions.

Areas for improvement:

  • Limited systems for formally tracking participant progress and identifying those in need of support.

IFA core values were clearly reflected in participant outputs. The values were at the center of every session.

Recommendations

Data Collection & Evaluation Systems

  • Implement a continuous data collection approach inspired by “learning organization” practices.

  • Centralize storage of key documents (rubrics, graded work) during the program lifecycle to streamline end-stage evaluations.

Program Design & Assessment Tools

  • Redesign rubrics for clarity and precision using measurable, descriptive criteria.

  • Strengthen assessment instruments to better align with learning goals and outcomes.

Instructional Structure & Engagement

  • Embed role-taking and verbal participation expectations into session materials to ensure equitable engagement.

  • Introduce clear guidelines for feedback and discussion to nurture collaboration and growth mindset.

  • Create intentional space for reflection (e.g., written journals, surveys, 1:1 interviews).

  • Integrate community values such as collaboration, resilience, and growth into lesson content.

  • Stagger key assignments (especially résumé and bio drafts) to reduce overload and increase quality.

  • Allow flexibility in deliverables to support individual learning journeys.

Facilitator Experience

  • Redefine the facilitator role to be more deeply integrated with the program’s structure and objectives.

  • Design a meaningful onboarding experience that provides facilitators with time to understand the curriculum and build rapport with learners.

Reflection

This program had already been piloted once, and the team showed excellent commitment to continuous improvement. The evaluation was more than a diagnostic: it was a design opportunity. By centering learner experience and integrating reflection, feedback, and flexibility into the structure itself, the program was positioned not only to evolve, but to do so with intention. For example, facilitator material was expanded to include rubrics to formally track student feedback.

These screenshots of the facilitator material show how rubrics and data collection were integrated into the program: the facilitator provides clear rubrics to the students (image on the left) and collects data from students continuously (within the lesson).