Data with Sheets

(Powered by Learning Companions)

1. What is “Data with Sheets”?

“Data with Sheets” is a self-paced learning programme that uses real-world data and Google Sheets to build three things at the same time:

  1. Spreadsheet skills and formula literacy
  2. Structured, step-by-step problem-solving habits
  3. Confidence and discipline for self-directed learning

It does this through carefully designed “problem scenarios” rather than isolated formula drills. Each scenario is broken into small, game-like tasks that learners can pick up and complete in short, focused sessions.


2. Who is this course for?

The course is designed primarily for adults and young adults who:

  • Are comfortable with basic computer use
  • Have very little prior experience with spreadsheet formulas
  • Want to analyse real work / life data (not toy examples)
  • Need external structure but minimal day-to-day teaching support

The course can be used both directly with learners (fellows, youth) and indirectly as a “problem universe” for facilitators to use with children later.


3. Core design principles

The course is built around a few non-negotiable design choices.

3.1 One new idea at a time

  • Each task introduces at most one genuinely new concept, skill, or piece of knowledge.
  • Everything else in that task must be already familiar to the learner.
  • This is critical because mathematics and data work are cumulative: small gaps in prior concepts can quickly compound.

3.2 Start with “dumbest and easiest” tasks

  • Early challenges are intentionally simple and “obviously solvable”.
  • The aim is to remove fear, create quick wins, and allow people to get used to the process and environment without shame.

3.3 Think of each task as a game

  • A learner never sees a huge mission; they see one game at a time.
  • Even when a concept needs many steps to master, it is implemented as a chain of small, independent-feeling games.

3.4 Real-life consequences as motivation

For each problem scenario, tasks are tied back to “what this means in real life”:

  • How this insight or skill could impact money, work, decisions, or projects
  • How closing this conceptual gap strengthens their overall “math resource universe”

This explicit consequence link is used repeatedly to maintain motivation.

3.5 English as the common language

Even when underlying data may be multilingual, the “game world” of tasks, instructions, and reflections is kept in English to build comfort with the language of tools and documentation.

3.6 Process and insight > solution

  • The course explicitly values reasoning steps, checks, failed attempts, and insights.
  • Solutions matter, but only as vehicles to get to deeper understanding and better strategies.

4. Course structure

At the heart of the course are:

  • Concepts (e.g. ranges, COUNTIF, VLOOKUP, basic statistics)
  • Problem scenarios based on real data
  • Mastery levels within each concept
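To make the concept list concrete, here is a hypothetical sketch of the kind of formula work a scenario might involve. The sheet layout, column names, and values below are invented for illustration; they are not taken from the course materials:

```
Cell   Formula                                    What it answers
E2     =COUNTIF(B2:B101, "Paid")                  How many orders are marked "Paid"?
E3     =SUM(C2:C101)                              What is the total order amount?
E4     =VLOOKUP("INV-1042", A2:D101, 4, FALSE)    What is the amount for invoice INV-1042?
```

Each of these would typically be a separate task, in line with the one-new-idea-at-a-time principle.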

4.1 Concepts and problem scenarios

Each major concept is represented by one or more rich “problem scenarios”, not just formula worksheets. For each scenario:

  • Tasks are prepared to cover all key aspects of working with that scenario.
  • Tasks move from well-defined problems to more open, messy ones.

4.2 Mastery levels: Foundation, Advanced, Expert

Within each concept, problems are grouped into mastery levels:

  • Foundation
    • Focus on single-step problems and basic use of functions.
    • Most learners are expected to do a significant amount here before touching advanced tasks.
  • Advanced
    • Multi-step questions that combine several basic skills.
    • Longer data ranges, multiple conditions, and more ambiguous wording.
  • Expert
    • Unstructured, messy problems closer to real work contexts.
    • Emphasis on problem framing, trade-offs, and insight generation.

Learners can choose to advance concept-by-concept (e.g. Foundation+Advanced for Ranges, only Foundation for Pivot Tables, etc.), based on their preparedness and interest.
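As an illustration of how the same concept might escalate across the three levels, consider a counting task on a hypothetical orders sheet (the tasks below are invented, not from the actual taskbook):

```
Foundation: =COUNTIF(C2:C101, ">100")
            "How many orders are above 100?"

Advanced:   =COUNTIFS(B2:B101, "Paid", C2:C101, ">100")
            "How many paid orders are above 100, and how does that split by month?"

Expert:     No formula is given. "The team believes large orders are rarely paid
            on time. Use this sheet to decide whether that is true, and say what
            you would check next."
```

The formulas stay simple; what grows is the number of conditions, the ambiguity of the wording, and the amount of framing the learner must do themselves.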

4.3 Sets of tasks, not isolated questions

  • Problems are organised into sets; each set is visually grouped (e.g. coloured rows) in the taskbook so learners can see boundaries clearly.
  • A single set may be enough work for several weeks; learners are encouraged to solve only a few tasks per week (e.g. 2–5) but with depth.
  • Completion is tracked at the set level rather than question-by-question.

4.4 The “six aspects” of a problem scenario

For each scenario, tasks are designed so that over time they touch six different aspects of competence, plus a question on real-life consequences. While the exact names of the six aspects can vary by concept, they typically include:

  • Understanding the situation and data structure
  • Choosing an appropriate approach / tools
  • Executing formulas or operations correctly
  • Checking and verifying results
  • Generalising the idea to new cases
  • Reflecting on what was learned and how it affects future problems

The course explicitly asks: “Have you practised each of these aspects enough before moving on to the next scenario?”


5. Learning modes: groups, pairs, and individuals

5.1 Rigour groups

In the first week, learners work on easier practice tasks while the course team quietly observes:

  • Comparative grasp of basics
  • Interest levels and pace

Based on this, learners are sorted into rigour groups – groups with similar grasp and intent so that people are not held back or pressured by extremes in either direction.

Each rigor group then has its own:

  • Suggested starting point
  • Recommended pace
  • Level of external support

5.2 Mandatory pairs

The course requires that:

  • Learners work in fixed pairs on the same problems each week.
  • There are mandatory common hours where pairs sit together (physically or online) to discuss their attempts, confusion points, and insights.

Pairs serve multiple functions:

  • Clarifying reasoning by explaining to another person
  • Detecting conceptual gaps when partners disagree
  • Building accountability and making the work less lonely

Tasks that both partners are unsure about are flagged for mentor attention and often become the basis for the next common session.


6. Sync and async time

The course deliberately separates synchronous and asynchronous work.

6.1 Asynchronous (self-paced) work

  • The entire task list is open from the start.
  • Learners can move back and forth between easier and harder tasks based on:
    • A feeling of boredom (“jump ahead”)
    • A feeling of overwhelm (“step back to prerequisites”)
  • They are encouraged to use any tools they wish – pen and paper, calculator, Sheets, etc. – as long as they can explain the reasoning behind their answers.

A simple score map helps them see:

  • How many sets (and which levels) they’ve completed per concept
  • Which components of the course remain to be done
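A score map could be as simple as one row per concept. The layout below is a hypothetical sketch, not a prescribed template:

```
Concept        Foundation   Advanced   Expert   Sets completed
Ranges         done         done       —        3 of 4
COUNTIF        done         started    —        2 of 5
VLOOKUP        started      —          —        1 of 4
Pivot Tables   —            —          —        0 of 3
```

Because completion is tracked at the set level, a glance at the rightmost column is enough to see both progress and remaining work.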

6.2 Synchronous (common) sessions

Common sessions are always structured as two separate sections:

  1. Introduction (≈30 minutes)
    • Focus on “Why are we learning this?”
    • Connect tasks to core ideas of problem-solving, data analysis, or statistics.
  2. Support (≈30–60 minutes)
    • Focus on “How can we solve this?”
    • Address hard conceptual clarity issues, mindset barriers, and hints for next steps.

Concept introduction and support are never mixed into one fuzzy session. This separation is a design choice to keep purpose and technique distinct.


7. Role of mentors

Mentors are not there to spoon-feed solutions; they are there to build independence.

Key responsibilities:

  • Observe closely how learners attempt tasks, not just whether they get answers right.
  • Break tasks into smaller steps when someone is struggling, and reassign those micro-tasks in the following week.
  • Facilitate conceptual clarity through “Why?” and “How?” questions and by modelling inquiry and humility.
  • Adjust the plan based on emerging prerequisite gaps noted by groups, pairs, or individuals.

The time investment is front-loaded in conceptual clarity:

  • If early time is spent on deep understanding, later self-learning capacity compounds and mentor effort per learner can decrease.
  • If early time is spent only on “doing tasks”, mentor effort stays constant or even increases over time.

8. Role of learners

Learners are expected to:

  • Be honest about how they feel: bored, stuck, or comfortable.
  • Use the boredom/overwhelm signals to decide when to jump ahead or step back.
  • Practise patience and humility with the process, not just chase completion.
  • Take ownership of their failed attempts:
    • Save and submit failed formulas in a separate sheet.
    • Note what they tried, why it failed, and what they learned from it.
  • Maintain a self-reflection dashboard indicating:
    • How many tasks they’ve solved for each aspect of a problem scenario
    • Whether they have enough exposure and consolidation to move ahead
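A failed-formula entry might be as simple as one row per attempt. The example below is invented to show the intended level of detail:

```
Date     Task    Formula tried               What went wrong                  What I learned
12 Mar   S2-T4   =VLOOKUP(A2, D2:F50, 2)     Fourth argument omitted, so      VLOOKUP needs FALSE as its
                                             the lookup matched the wrong     fourth argument for an
                                             row                              exact match
```

Keeping these rows in a separate sheet makes failed attempts visible and reviewable rather than something to hide.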

There is an explicit focus on building:

  • Self-evaluation ability
  • Self-teaching ability
  • Problem-solving ability

These are treated as equal in importance to formula skills.


9. Use of GPT / AI

The course intentionally integrates GPT as a support tool, not as an answer machine.

Planned uses:

  • A specialised prompt that, given:
    • A description of the problem where a learner struggled
    • Optional description of which aspect they struggled with
    • Access to the prerequisites database
    …generates very basic prerequisite challenges for them to practice.
  • Using GPT to help design and refine prompts for generating new problem variations.
  • Using GPT to generate additional puzzles, edge cases, and structured learning problems that can then be curated and placed into the Foundation / Advanced / Expert stages.

GPT is, in short, a problem generator and design assistant, not the main teacher.
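The prerequisite-generation prompt could be structured roughly as follows; the wording is illustrative, not the course’s actual prompt:

```
You are helping a learner in a spreadsheet course.

Problem where they struggled: <task description>
Aspect they struggled with (optional): <e.g. "checking and verifying results">
Relevant prerequisites: <entries from the prerequisites database>

Generate 3–5 very basic practice challenges, each isolating ONE prerequisite,
ordered from easiest to hardest. Do not reveal the solution to the
original problem.
```

Constraining the output to basic, single-prerequisite challenges keeps GPT in its intended role as a problem generator rather than an answer machine.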


10. What’s new in “Data with Sheets 3.0” (vs 2.0)

Compared to the previous version of the course, the 3.0 design adds or sharpens:

  • Problems shared as whole sets per concept, already ordered from well-defined to ill-defined.
  • Clear mastery level structure (Foundation, Advanced, Expert) inside each concept.
  • Explicit guidance that a single set is a long-term task (up to two months), and that learners must prioritise understanding over speed.
  • Stronger emphasis on:
    • Saving and celebrating failed formulas and approaches
    • Documenting insights and verification, not just final answers
    • Self-evaluation techniques and confidence building
  • Mandatory pair-work and common peer hours, with mentor time mainly reserved for:
    • Most challenging conceptual problems
    • Mindset barriers
    • Planning next steps

Overall, 3.0 is less about “finishing a course” and more about growing into someone who can attack messy data problems independently.


11. How progress and success are measured

Success is not measured only by “completion of tasks”, but by observable changes in:

  • Ability to engage with longer, more unstructured problems without giving up.
  • Breadth of concepts covered to a reasonable foundational level.
  • Depth of conceptual clarity (as seen in explanations, peer discussions, and self-reflections).
  • Quality of insights about data and decisions, not just correctness of formulas.
  • Growth in self-evaluation, self-teaching, and disciplined use of boredom/overwhelm signals.

Regular check-ins are built into the design so that both mentors and learners can see movement and remain motivated.

