Revised source ingestion, MIT OCW example

This commit is contained in:
welsberr 2026-03-16 13:54:07 -04:00
parent 5969a932d3
commit eb766ac25f
55 changed files with 4375 additions and 936 deletions

View File

@@ -29,6 +29,7 @@ Then open:

- `examples/ocw-information-entropy-run/learner_progress.html`
- `examples/ocw-information-entropy-skill-demo/skill_demo.md`
- `examples/ocw-information-entropy-rolemesh-transcript/rolemesh_transcript.md`
- `skills/ocw-information-entropy-agent/`

That gives you:
@@ -37,6 +38,7 @@ That gives you:

- a visible learning path
- progress artifacts
- a reusable skill grounded in the exported knowledge
- a transcript showing how a local-LLM-backed learner/mentor interaction can look

The point is not to replace your effort. The point is to give your effort structure, feedback, and momentum.
@@ -197,6 +199,7 @@ Primary outputs:

- `review_report.md`
- `conflict_report.md`
- `license_attribution.json`
- `pack_compliance_manifest.json` when a source inventory is provided

### 2. Review and workspace management
@@ -264,7 +267,24 @@ Key capabilities:

- markdown capability reports
- artifact manifests

### 5. Local model integration
Didactopus can now target a RoleMesh Gateway-backed local LLM setup through its `ModelProvider` abstraction.
Main modules:
- `didactopus.model_provider`
- `didactopus.role_prompts`
- `didactopus.rolemesh_demo`
What this enables:
- role-based local model routing
- separate mentor/practice/project-advisor/evaluator prompts
- local heterogeneous compute usage through an OpenAI-compatible gateway
- a clean path to keep tutoring assistance structured instead of offloading learner work
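The `ModelProvider` abstraction itself is not reproduced in this commit view, so the following is an illustrative sketch only: it shows the general shape such a provider switch can take (a deterministic stub backend for offline runs, a gateway backend gated behind real configuration), not the actual Didactopus API. All names and signatures below are hypothetical.

```python
from dataclasses import dataclass
from typing import Protocol


class ModelProvider(Protocol):
    """Minimal provider interface: one chat turn for a named role."""

    def complete(self, role: str, prompt: str) -> str: ...


@dataclass
class StubProvider:
    """Deterministic stand-in used when no local gateway is running."""

    model_name: str = "local-demo"

    def complete(self, role: str, prompt: str) -> str:
        # Echo enough context to make offline transcripts inspectable.
        return f"[{self.model_name}/{role}] stub reply to: {prompt}"


def make_provider(config: dict) -> ModelProvider:
    # Mirrors the config switch: provider: "stub" vs provider: "rolemesh".
    if config.get("provider") == "stub":
        local = config.get("local", {})
        return StubProvider(local.get("model_name", "local-demo"))
    raise NotImplementedError("rolemesh provider needs a live gateway")
```

The stub path is what makes the demos deterministic: the same config dict either yields canned replies or (with a live gateway) real role-routed completions.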
### 6. Agentic learner demos and visualization
The repository includes deterministic agentic demos rather than a live external model integration.
@@ -306,6 +326,43 @@ This writes:

- `examples/ocw-information-entropy-run/`
- `skills/ocw-information-entropy-agent/`
The generated MIT OCW pack also includes:
- `license_attribution.json`
- `pack_compliance_manifest.json`
- `source_inventory.yaml`
### Try the local RoleMesh integration path
Stubbed local-provider demo:
```bash
python -m didactopus.rolemesh_demo --config configs/config.example.yaml
```
RoleMesh-backed example config:
```bash
python -m didactopus.rolemesh_demo --config configs/config.rolemesh.example.yaml
```
MIT OCW learner transcript through the local-LLM path:
```bash
python -m didactopus.ocw_rolemesh_transcript_demo --config configs/config.rolemesh.example.yaml
```
If your local models are slow, Didactopus now prints pending-status lines while each mentor, practice, learner, or evaluator turn is being generated. For a long manual run, capture both the transcript payload and those live status messages:
```bash
python -u -m didactopus.ocw_rolemesh_transcript_demo \
--config configs/config.rolemesh.example.yaml \
--out-dir examples/ocw-information-entropy-rolemesh-transcript \
2>&1 | tee examples/ocw-information-entropy-rolemesh-transcript/manual-run.log
```
That command leaves the final transcript in `rolemesh_transcript.md` and `rolemesh_transcript.json`, while `manual-run.log` preserves the conversational “working on it” notices during the wait.
### Render learner progress visualizations

Path-focused view:

@@ -357,6 +414,8 @@ What remains heuristic or lightweight:

- [docs/mastery-ledger.md](docs/mastery-ledger.md)
- [docs/workspace-manager.md](docs/workspace-manager.md)
- [docs/interactive-review-ui.md](docs/interactive-review-ui.md)
- [docs/mit-ocw-course-guide.md](docs/mit-ocw-course-guide.md)
- [docs/rolemesh-integration.md](docs/rolemesh-integration.md)
- [docs/faq.md](docs/faq.md)

## MIT OCW Demo Notes

View File

@@ -6,3 +6,8 @@ bridge:
  port: 8765
  registry_path: "workspace_registry.json"
  default_workspace_root: "workspaces"
model_provider:
  provider: "stub"
  local:
    backend: "stub"
    model_name: "local-demo"

View File

@@ -0,0 +1,26 @@
review:
  default_reviewer: "Wesley R. Elsberry"
  write_promoted_pack: true
bridge:
  host: "127.0.0.1"
  port: 8765
  registry_path: "workspace_registry.json"
  default_workspace_root: "workspaces"
model_provider:
  provider: "rolemesh"
  local:
    backend: "stub"
    model_name: "unused-when-rolemesh-enabled"
  rolemesh:
    base_url: "http://127.0.0.1:8000"
    api_key: "change-me-client-key-1"
    default_model: "planner"
    role_to_model:
      mentor: "planner"
      learner: "writer"
      practice: "writer"
      project_advisor: "planner"
      evaluator: "reviewer"
    timeout_seconds: 30.0

View File

@@ -50,6 +50,11 @@ The pack emitter writes:

- `review_report.md`
- `conflict_report.md`
- `license_attribution.json`
- `source_corpus.json`
- `source_corpus.json`
`source_corpus.json` is the main grounded-text artifact. It preserves lesson bodies, objectives,
exercises, and source references from the ingested material so downstream tutoring or evaluation
can rely on source-derived text instead of only the distilled concept graph.
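The exact schema of `source_corpus.json` is not spelled out in this commit view, so the field names below are assumptions; the sketch only illustrates how a downstream consumer could pull source-derived lesson text rather than regenerating it from the concept graph.

```python
import json

# Illustrative corpus snippet; the real source_corpus.json layout may differ.
corpus_json = json.dumps({
    "lessons": [
        {
            "title": "Shannon Entropy",
            "body": "Entropy measures average uncertainty in a source.",
            "objectives": ["Explain Shannon entropy as a measure of uncertainty."],
            "exercises": ["Compute the entropy of a Bernoulli source."],
            "source_refs": [
                "https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/"
            ],
        }
    ]
})


def grounded_text(corpus: dict, title: str) -> str:
    """Return source-derived text for one lesson, e.g. for prompts or grading."""
    for lesson in corpus.get("lessons", []):
        if lesson["title"] == title:
            parts = [lesson["body"], *lesson["objectives"], *lesson["exercises"]]
            return "\n".join(parts)
    raise KeyError(title)


corpus = json.loads(corpus_json)
text = grounded_text(corpus, "Shannon Entropy")
```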
## Rule layer

@@ -77,4 +82,4 @@ The end-to-end reference flow in this repository is:

```bash
python -m didactopus.ocw_information_entropy_demo
```

That command ingests the MIT OCW Information and Entropy source file or directory tree in `examples/ocw-information-entropy/`, emits a draft pack into `domain-packs/mit-ocw-information-entropy/`, writes a grounded `source_corpus.json`, runs a deterministic agentic learner over the generated path, and writes downstream skill/visualization artifacts.

View File

@ -0,0 +1,111 @@
# RoleMesh Integration
RoleMesh Gateway is an appropriate dependency for local-LLM-backed Didactopus usage.
## Why it fits
The local RoleMesh codebase provides the core capabilities Didactopus needs for a local heterogeneous inference setup:
- OpenAI-compatible `POST /v1/chat/completions`
- role-based model routing
- local or multi-host upstream registration
- flexible model loading and switching through the gateway/node-agent split
- per-role defaults for temperature and other request settings
That means Didactopus can keep a simple provider abstraction while delegating model placement and routing to RoleMesh.
## Recommended architecture
1. Run RoleMesh Gateway as the OpenAI-compatible front door.
2. Point RoleMesh roles at local backends or discovered node agents.
3. Configure Didactopus to use the `rolemesh` model provider.
4. Let Didactopus send mentor/practice/project-advisor/evaluator requests by role.
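Step 4 reduces to ordinary OpenAI-compatible chat requests in which the `model` field carries a role-mapped alias. A minimal sketch, assuming the placeholder gateway URL and client key from `configs/config.rolemesh.example.yaml`; the helper name is hypothetical:

```python
import json
import urllib.request


def build_request(base_url: str, api_key: str, model: str,
                  system: str, user: str) -> urllib.request.Request:
    """Assemble an OpenAI-compatible chat completion request for the gateway."""
    payload = {
        "model": model,  # a RoleMesh alias such as "planner" or "reviewer"
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_request(
    "http://127.0.0.1:8000",
    "change-me-client-key-1",
    "planner",
    "You are a Socratic mentor; ask questions rather than giving answers.",
    "Help me plan the entropy unit.",
)
# Against a live gateway this would be sent with urllib.request.urlopen(req).
```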
## Didactopus-side config
Use `configs/config.rolemesh.example.yaml` as the starting point.
The important fields are:
- `model_provider.provider: rolemesh`
- `model_provider.rolemesh.base_url`
- `model_provider.rolemesh.api_key`
- `model_provider.rolemesh.default_model`
- `model_provider.rolemesh.role_to_model`
## Suggested role mapping
With the sample RoleMesh gateway config, this is a good default mapping:
- `mentor -> planner`
- `practice -> writer`
- `project_advisor -> planner`
- `evaluator -> reviewer`
This keeps Didactopus prompts aligned with the role semantics RoleMesh already exposes.
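In code, the mapping is a plain lookup with a fallback to `default_model`; a sketch (not the actual Didactopus implementation) using the aliases above plus the `learner` entry from the example config:

```python
# Role-to-alias mapping taken from configs/config.rolemesh.example.yaml.
ROLE_TO_MODEL = {
    "mentor": "planner",
    "learner": "writer",
    "practice": "writer",
    "project_advisor": "planner",
    "evaluator": "reviewer",
}
DEFAULT_MODEL = "planner"  # model_provider.rolemesh.default_model


def model_for_role(role: str) -> str:
    """Resolve a Didactopus role to a RoleMesh model alias."""
    return ROLE_TO_MODEL.get(role, DEFAULT_MODEL)
```

Unknown roles degrade gracefully to the default alias, which matches how the config keeps `default_model` alongside `role_to_model`.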
## Prompt layer
Didactopus now keeps its default RoleMesh-oriented prompts in:
- `didactopus.role_prompts`
These prompts are intentionally anti-offloading:
- mentor mode prefers Socratic questions and hints
- practice mode prefers reasoning-heavy tasks
- project-advisor mode prefers synthesis work
- evaluator mode prefers critique and explicit limitations
## Demo command
To exercise the integration path without a live RoleMesh gateway, run:
```bash
python -m didactopus.rolemesh_demo --config configs/config.example.yaml
```
That uses the stub provider path.
To point at a live RoleMesh deployment, start from:
```bash
python -m didactopus.rolemesh_demo --config configs/config.rolemesh.example.yaml
```
and replace the placeholder gateway URL/API key with your real local setup.
## Example transcript
The repository now includes a generated transcript of an AI learner using the local-LLM path to approach the MIT OCW Information and Entropy course:
- `examples/ocw-information-entropy-rolemesh-transcript/rolemesh_transcript.md`
Generator command:
```bash
python -m didactopus.ocw_rolemesh_transcript_demo --config configs/config.rolemesh.example.yaml
```
If some RoleMesh aliases are unhealthy, the transcript demo automatically falls back to the healthy local alias and records that in the output metadata.
If local inference is slow, the transcript demo now emits pending notices such as “Didactopus is evaluating the work before replying” while each turn is still running. For a full manual capture, run:
```bash
python -u -m didactopus.ocw_rolemesh_transcript_demo \
--config configs/config.rolemesh.example.yaml \
--out-dir examples/ocw-information-entropy-rolemesh-transcript \
2>&1 | tee examples/ocw-information-entropy-rolemesh-transcript/manual-run.log
```
That gives you three artifacts:
- `rolemesh_transcript.json`
- `rolemesh_transcript.md`
- `manual-run.log` with the live “pending” status messages
For slower larger models, expect the transcript run to take several minutes rather than seconds. The command above is the recommended way to capture the whole session outside Codex.
## Gateway-side note
This repository does not vendor RoleMesh. It assumes a local RoleMesh codebase or deployment exists separately. The reference local codebase is suitable because it already provides the API and routing semantics Didactopus needs.

View File

@@ -1,54 +1,203 @@
concepts:
- id: mit-ocw-6-050j-information-and-entropy-course-home
  title: 'MIT OCW 6.050J Information and Entropy: Course Home'
  description: 'Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/

    Attribution: adapted from the MIT OpenCourseWare course home page for 6.050J Information
    and Entropy.'
  prerequisites: []
  mastery_signals: []
  mastery_profile: {}
- id: information-and-entropy
  title: Information and Entropy
  description: '- Objective: Identify the course title, instructors, departments,
    level, and major topical areas.

    - Exercise: Summarize the course in one paragraph for a prospective learner.

    MIT OpenCourseWare presents 6.050J Information and Entropy as a S'
  prerequisites:
  - mit-ocw-6-050j-information-and-entropy-course-home
  mastery_signals:
  - Identify the course title, instructors, departments, level, and major topical
    areas.
  mastery_profile: {}
- id: paul
  title: Paul
  description: Candidate concept extracted from lesson 'Information and Entropy'.
  prerequisites: []
  mastery_signals:
  - Identify the course title, instructors, departments, level, and major topical
    areas.
  mastery_profile: {}
- id: penfield
  title: Penfield
  description: Candidate concept extracted from lesson 'Information and Entropy'.
  prerequisites: []
  mastery_signals:
  - Identify the course title, instructors, departments, level, and major topical
    areas.
  mastery_profile: {}
- id: seth
  title: Seth
  description: Candidate concept extracted from lesson 'Information and Entropy'.
  prerequisites: []
  mastery_signals:
  - Identify the course title, instructors, departments, level, and major topical
    areas.
  mastery_profile: {}
- id: lloyd
  title: Lloyd
  description: Candidate concept extracted from lesson 'Information and Entropy'.
  prerequisites: []
  mastery_signals:
  - Identify the course title, instructors, departments, level, and major topical
    areas.
  mastery_profile: {}
- id: electrical
  title: Electrical
  description: Candidate concept extracted from lesson 'Information and Entropy'.
  prerequisites: []
  mastery_signals:
  - Identify the course title, instructors, departments, level, and major topical
    areas.
  mastery_profile: {}
- id: engineering
  title: Engineering
  description: Candidate concept extracted from lesson 'Information and Entropy'.
  prerequisites: []
  mastery_signals:
  - Identify the course title, instructors, departments, level, and major topical
    areas.
  mastery_profile: {}
- id: ultimate-limits-to-communication-and-computation
  title: Ultimate Limits to Communication and Computation
  description: '- Objective: Explain the broad intellectual scope of the course.

    - Exercise: List the main topic clusters that connect communication, computation,
    and entropy.

    The course examines the ultimate limits to communication and computation with
    em'
  prerequisites:
  - information-and-entropy
  mastery_signals:
  - Explain the broad intellectual scope of the course.
  mastery_profile: {}
- id: entropy
  title: Entropy
  description: Candidate concept extracted from lesson 'Ultimate Limits to Communication
    and Computation'.
  prerequisites: []
  mastery_signals:
  - Explain the broad intellectual scope of the course.
  mastery_profile: {}
- id: open-textbooks-problem-sets-and-programming-work
  title: Open Textbooks, Problem Sets, and Programming Work
  description: '- Objective: Identify the main kinds of learning resources supplied
    through the course.

    - Exercise: Explain how these resource types support both conceptual study and
    practice.

    The course home lists open textbooks, problem sets, problem set'
  prerequisites:
  - ultimate-limits-to-communication-and-computation
  mastery_signals:
  - Identify the main kinds of learning resources supplied through the course.
  mastery_profile: {}
- id: mit-ocw-6-050j-information-and-entropy-syllabus
  title: 'MIT OCW 6.050J Information and Entropy: Syllabus'
  description: 'Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/

    Attribution: adapted from the MIT OpenCourseWare syllabus page for 6.050J Information
    and Entropy.'
  prerequisites:
  - open-textbooks-problem-sets-and-programming-work
  mastery_signals: []
  mastery_profile: {}
- id: prerequisites-and-mathematical-background
  title: Prerequisites and Mathematical Background
  description: '- Objective: Explain the mathematical maturity expected by the course.

    - Exercise: Decide whether a learner needs review in probability, linear algebra,
    or signals before beginning.

    The syllabus expects a foundation comparable to MIT subjec'
  prerequisites:
  - mit-ocw-6-050j-information-and-entropy-syllabus
  mastery_signals:
  - Explain the mathematical maturity expected by the course.
  mastery_profile: {}
- id: assessment-structure
  title: Assessment Structure
  description: '- Objective: Identify the role of problem sets, exams, and programming
    work in the course.

    - Exercise: Build a study schedule that alternates reading, derivation, and worked
    exercises.

    The syllabus emphasizes regular problem solving and qua'
  prerequisites:
  - prerequisites-and-mathematical-background
  mastery_signals:
  - Identify the role of problem sets, exams, and programming work in the course.
  mastery_profile: {}
- id: course-notes-and-reference-texts
  title: Course Notes and Reference Texts
  description: '- Objective: Explain how the course notes and textbook references
    supply the core conceptual sequence.

    - Exercise: Compare when to use course notes versus outside references for clarification.

    MIT OCW links course notes and textbook-style r'
  prerequisites:
  - assessment-structure
  mastery_signals:
  - Explain how the course notes and textbook references supply the core conceptual
    sequence.
  mastery_profile: {}
- id: independent-reasoning-and-careful-comparison
  title: Independent Reasoning and Careful Comparison
  description: '- Objective: Explain why the course requires precise comparison of
    related but non-identical concepts.

    - Exercise: Write a short note distinguishing Shannon entropy, channel capacity,
    and thermodynamic entropy.

    The syllabus framing implies'
  prerequisites:
  - course-notes-and-reference-texts
  mastery_signals:
  - Explain why the course requires precise comparison of related but non-identical
    concepts.
  mastery_profile: {}
- id: shannon
  title: Shannon
  description: Candidate concept extracted from lesson 'Independent Reasoning and
    Careful Comparison'.
  prerequisites: []
  mastery_signals:
  - Explain why the course requires precise comparison of related but non-identical
    concepts.
  mastery_profile: {}
- id: learners
  title: Learners
  description: Candidate concept extracted from lesson 'Independent Reasoning and
    Careful Comparison'.
  prerequisites: []
  mastery_signals:
  - Explain why the course requires precise comparison of related but non-identical
    concepts.
  mastery_profile: {}
- id: mit-ocw-6-050j-information-and-entropy-unit-sequence
  title: 'MIT OCW 6.050J Information and Entropy: Unit Sequence'
  description: 'Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/

    Attribution: adapted from the MIT OpenCourseWare unit progression and resource
    organization for 6.050J Information and Entropy.'
  prerequisites:
  - independent-reasoning-and-careful-comparison
  mastery_signals: []
  mastery_profile: {}
- id: counting-and-probability
@@ -59,284 +208,132 @@ concepts:
    - Exercise: Derive a simple counting argument for binary strings and compute an
    event probability.

    Early units e'
  prerequisites:
  - mit-ocw-6-050j-information-and-entropy-unit-sequence
  mastery_signals:
  - Explain how counting arguments, probability spaces, and random variables support
    later information-theory results.
  mastery_profile: {}
- id: derive
  title: Derive
  description: Candidate concept extracted from lesson 'Counting and Probability'.
  prerequisites: []
  mastery_signals:
  - Explain how counting arguments, probability spaces, and random variables support
    later information-theory results.
  mastery_profile: {}
- id: shannon-entropy
  title: Shannon Entropy
  description: '- Objective: Explain Shannon entropy as a measure of uncertainty and
    compare high-entropy and low-entropy sources.

    - Exercise: Compute the entropy of a Bernoulli source and interpret the result.

    The course then introduces entropy as a quant'
  prerequisites:
  - counting-and-probability
  mastery_signals:
  - Explain Shannon entropy as a measure of uncertainty and compare high-entropy and
    low-entropy sources.
  mastery_profile: {}
- id: bernoulli
  title: Bernoulli
  description: Candidate concept extracted from lesson 'Shannon Entropy'.
  prerequisites: []
  mastery_signals:
  - Explain Shannon entropy as a measure of uncertainty and compare high-entropy and
    low-entropy sources.
  mastery_profile: {}
- id: mutual-information
  title: Mutual Information
  description: '- Objective: Explain mutual information and relate it to dependence
    between signals or observations.

    - Exercise: Compare independent variables with dependent variables using mutual-information
    reasoning.

    These units ask the learner to under'
  prerequisites:
  - shannon-entropy
  mastery_signals:
  - Explain mutual information and relate it to dependence between signals or observations.
  mastery_profile: {}
- id: source-coding-and-compression
  title: Source Coding and Compression
  description: '- Objective: Explain lossless compression in terms of entropy, redundancy,
    and coding choices.

    - Exercise: Describe when compression succeeds and when it fails on already-random
    data.

    The course develops the idea that structured sources can'
  prerequisites:
  - mutual-information
  mastery_signals:
  - Explain lossless compression in terms of entropy, redundancy, and coding choices.
  mastery_profile: {}
- id: huffman-coding
  title: Huffman Coding
  description: '- Objective: Explain Huffman coding and justify why likely symbols
    receive shorter descriptions.

    - Exercise: Build a Huffman code for a small source alphabet.

    Learners use trees and expected length arguments to connect probability models
    to'
  prerequisites:
  - source-coding-and-compression
  mastery_signals:
  - Explain Huffman coding and justify why likely symbols receive shorter descriptions.
  mastery_profile: {}
- id: channel-capacity
  title: Channel Capacity
  description: '- Objective: Explain channel capacity as a limit on reliable communication
    over a noisy channel.

    - Exercise: State why reliable transmission above capacity is impossible in the
    long run.

    The course treats capacity as a fundamental upper bou'
  prerequisites:
  - huffman-coding
  mastery_signals:
  - Explain channel capacity as a limit on reliable communication over a noisy channel.
  mastery_profile: {}
- id: channel-coding
  title: Channel Coding
  description: '- Objective: Explain how channel coding adds redundancy to protect
    messages from noise.

    - Exercise: Contrast uncoded transmission with coded transmission on a noisy channel.

    These units emphasize that redundancy can be wasteful in compressi'
  prerequisites:
  - channel-capacity
  mastery_signals:
  - Explain how channel coding adds redundancy to protect messages from noise.
  mastery_profile: {}
- id: contrast
  title: Contrast
  description: Candidate concept extracted from lesson 'Channel Coding'.
  prerequisites: []
  mastery_signals:
  - Explain how channel coding adds redundancy to protect messages from noise.
  mastery_profile: {}
- id: error-correcting-codes
  title: Error Correcting Codes
  description: '- Objective: Explain how error-correcting codes detect or repair corrupted
    symbols.

    - Exercise: Describe a simple parity-style code and its limits.

    The learner must connect abstract limits to concrete coding mechanisms and understand
    both s'
  prerequisites:
  - channel-coding
  mastery_signals:
  - Explain how error-correcting codes detect or repair corrupted symbols.
  mastery_profile: {}
- id: cryptography-and-information-hiding
  title: Cryptography and Information Hiding
@ -345,24 +342,11 @@ concepts:
     - Exercise: Compare a secure scheme with a weak one in terms of revealed information.
-    This lesson combines Cryptography, Information Leakag'
+    The course extends information-theoretic reasoning to'
   prerequisites:
   - error-correcting-codes
-  mastery_signals: []
-  mastery_profile: {}
-- id: cryptography
-  title: Cryptography
-  description: Candidate concept extracted from lesson 'Cryptography and Information
-    Hiding'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: hiding
-  title: Hiding
-  description: Candidate concept extracted from lesson 'Cryptography and Information
-    Hiding'.
-  prerequisites: []
-  mastery_signals: []
+  mastery_signals:
+  - Explain the relationship between secrecy, information leakage, and coded communication.
   mastery_profile: {}
 - id: thermodynamics-and-entropy
   title: Thermodynamics and Entropy
@@ -372,49 +356,37 @@ concepts:
     - Exercise: Compare the two entropy notions and identify what is preserved across
     the analogy.
-    This lesson connects Thermodynamics, Entropy, and P'
+    The course uses entropy as a bridge concept between'
   prerequisites:
   - cryptography-and-information-hiding
-  mastery_signals: []
+  mastery_signals:
+  - Explain how thermodynamic entropy relates to, and differs from, Shannon entropy.
   mastery_profile: {}
-- id: thermodynamics
-  title: Thermodynamics
-  description: Candidate concept extracted from lesson 'Thermodynamics and Entropy'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
+- id: reversible-computation-and-quantum-computation
+  title: Reversible Computation and Quantum Computation
+  description: '- Objective: Explain why the physical implementation of computation
+    matters for information processing limits.
+    - Exercise: Summarize how reversible computation changes the discussion of dissipation
+    and information loss.
+    Later units connect'
+  prerequisites:
+  - thermodynamics-and-entropy
+  mastery_signals:
+  - Explain why the physical implementation of computation matters for information
+    processing limits.
+  mastery_profile: {}
 - id: course-synthesis
   title: Course Synthesis
   description: '- Objective: Synthesize the course by connecting entropy, coding,
-    reliability, and physical interpretation in one coherent narrative.
+    reliability, secrecy, and physical interpretation in one coherent narrative.
     - Exercise: Produce a final study guide that links source coding, channel coding,
-    secrecy, and thermodynam'
+    secrecy, thermo'
   prerequisites:
-  - thermodynamics-and-entropy
-  mastery_signals: []
-  mastery_profile: {}
-- id: course
-  title: Course
-  description: Candidate concept extracted from lesson 'Course Synthesis'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: synthesis
-  title: Synthesis
-  description: Candidate concept extracted from lesson 'Course Synthesis'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: synthesize
-  title: Synthesize
-  description: Candidate concept extracted from lesson 'Course Synthesis'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: produce
-  title: Produce
-  description: Candidate concept extracted from lesson 'Course Synthesis'.
-  prerequisites: []
-  mastery_signals: []
+  - reversible-computation-and-quantum-computation
+  mastery_signals:
+  - Synthesize the course by connecting entropy, coding, reliability, secrecy, and
+    physical interpretation in one coherent narrative.
   mastery_profile: {}


@@ -2,9 +2,19 @@
   "rights_note": "Derived from MIT OpenCourseWare 6.050J Information and Entropy (Spring 2008). Retain MIT OCW attribution and applicable Creative Commons terms before redistribution.",
   "sources": [
     {
-      "source_path": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/6-050j-information-and-entropy.md",
+      "source_path": "examples/ocw-information-entropy/course/course-home.md",
       "source_type": "markdown",
-      "title": "6 050J Information And Entropy"
+      "title": "Course Home"
+    },
+    {
+      "source_path": "examples/ocw-information-entropy/course/syllabus.md",
+      "source_type": "markdown",
+      "title": "Syllabus"
+    },
+    {
+      "source_path": "examples/ocw-information-entropy/course/unit-sequence.md",
+      "source_type": "markdown",
+      "title": "Unit Sequence"
     }
   ]
 }


@@ -12,3 +12,5 @@ dependencies: []
 overrides: []
 profile_templates: {}
 cross_pack_links: []
+supporting_artifacts:
+- source_corpus.json


@@ -3,6 +3,7 @@
   "display_name": "MIT OCW Information and Entropy",
   "derived_from_sources": [
     "mit-ocw-6-050j-course-home",
+    "mit-ocw-6-050j-syllabus",
     "mit-ocw-6-050j-unit-8-textbook",
     "mit-ocw-6-050j-unit-13-textbook"
   ],


@@ -1,59 +1,5 @@
 # Review Report
-- Module 'Imported from MARKDOWN' has no explicit exercises; mastery signals may be weak.
-- Concept 'MIT OCW 6.050J Information and Entropy' has no extracted mastery signals; review manually.
-- Concept 'Information' has no extracted mastery signals; review manually.
-- Concept 'Entropy' has no extracted mastery signals; review manually.
-- Concept 'Source' has no extracted mastery signals; review manually.
-- Concept 'OpenCourseWare' has no extracted mastery signals; review manually.
-- Concept 'Spring' has no extracted mastery signals; review manually.
-- Concept 'Attribution' has no extracted mastery signals; review manually.
-- Concept 'Counting and Probability' has no extracted mastery signals; review manually.
-- Concept 'Counting' has no extracted mastery signals; review manually.
-- Concept 'Probability' has no extracted mastery signals; review manually.
-- Concept 'Objective' has no extracted mastery signals; review manually.
-- Concept 'Explain' has no extracted mastery signals; review manually.
-- Concept 'Exercise' has no extracted mastery signals; review manually.
-- Concept 'Derive' has no extracted mastery signals; review manually.
-- Concept 'This' has no extracted mastery signals; review manually.
-- Concept 'Random' has no extracted mastery signals; review manually.
-- Concept 'Shannon Entropy' has no extracted mastery signals; review manually.
-- Concept 'Shannon' has no extracted mastery signals; review manually.
-- Concept 'Compute' has no extracted mastery signals; review manually.
-- Concept 'Bernoulli' has no extracted mastery signals; review manually.
-- Concept 'Mutual Information' has no extracted mastery signals; review manually.
-- Concept 'Mutual' has no extracted mastery signals; review manually.
-- Concept 'Compare' has no extracted mastery signals; review manually.
-- Concept 'Dependence' has no extracted mastery signals; review manually.
-- Concept 'Data Compression' has no extracted mastery signals; review manually.
-- Concept 'Data' has no extracted mastery signals; review manually.
-- Concept 'Compression' has no extracted mastery signals; review manually.
-- Concept 'Describe' has no extracted mastery signals; review manually.
-- Concept 'Redundancy' has no extracted mastery signals; review manually.
-- Concept 'Huffman Coding' has no extracted mastery signals; review manually.
-- Concept 'Huffman' has no extracted mastery signals; review manually.
-- Concept 'Coding' has no extracted mastery signals; review manually.
-- Concept 'Build' has no extracted mastery signals; review manually.
-- Concept 'Prefix' has no extracted mastery signals; review manually.
-- Concept 'Channel Capacity' has no extracted mastery signals; review manually.
-- Concept 'Channel' has no extracted mastery signals; review manually.
-- Concept 'Capacity' has no extracted mastery signals; review manually.
-- Concept 'State' has no extracted mastery signals; review manually.
-- Concept 'Reliable' has no extracted mastery signals; review manually.
-- Concept 'Channel Coding' has no extracted mastery signals; review manually.
-- Concept 'Contrast' has no extracted mastery signals; review manually.
-- Concept 'Decoding' has no extracted mastery signals; review manually.
-- Concept 'Error Correcting Codes' has no extracted mastery signals; review manually.
-- Concept 'Error' has no extracted mastery signals; review manually.
-- Concept 'Correcting' has no extracted mastery signals; review manually.
-- Concept 'Codes' has no extracted mastery signals; review manually.
-- Concept 'Cryptography and Information Hiding' has no extracted mastery signals; review manually.
-- Concept 'Cryptography' has no extracted mastery signals; review manually.
-- Concept 'Hiding' has no extracted mastery signals; review manually.
-- Concept 'Thermodynamics and Entropy' has no extracted mastery signals; review manually.
-- Concept 'Thermodynamics' has no extracted mastery signals; review manually.
-- Concept 'Course Synthesis' has no extracted mastery signals; review manually.
-- Concept 'Course' has no extracted mastery signals; review manually.
-- Concept 'Synthesis' has no extracted mastery signals; review manually.
-- Concept 'Synthesize' has no extracted mastery signals; review manually.
-- Concept 'Produce' has no extracted mastery signals; review manually.
+- Concept 'MIT OCW 6.050J Information and Entropy: Course Home' has no extracted mastery signals; review manually.
+- Concept 'MIT OCW 6.050J Information and Entropy: Syllabus' has no extracted mastery signals; review manually.
+- Concept 'MIT OCW 6.050J Information and Entropy: Unit Sequence' has no extracted mastery signals; review manually.


@@ -2,16 +2,50 @@ stages:
 - id: stage-1
   title: Imported from MARKDOWN
   concepts:
-  - mit-ocw-6-050j-information-and-entropy
+  - mit-ocw-6-050j-information-and-entropy-course-home
+  - information-and-entropy
+  - ultimate-limits-to-communication-and-computation
+  - open-textbooks-problem-sets-and-programming-work
+  - mit-ocw-6-050j-information-and-entropy-syllabus
+  - prerequisites-and-mathematical-background
+  - assessment-structure
+  - course-notes-and-reference-texts
+  - independent-reasoning-and-careful-comparison
+  - mit-ocw-6-050j-information-and-entropy-unit-sequence
   - counting-and-probability
   - shannon-entropy
   - mutual-information
-  - data-compression
+  - source-coding-and-compression
   - huffman-coding
   - channel-capacity
   - channel-coding
   - error-correcting-codes
   - cryptography-and-information-hiding
   - thermodynamics-and-entropy
+  - reversible-computation-and-quantum-computation
   - course-synthesis
-  checkpoint: []
+  checkpoint:
+  - Summarize the course in one paragraph for a prospective learner.
+  - List the main topic clusters that connect communication, computation, and entropy.
+  - Explain how these resource types support both conceptual study and practice.
+  - Decide whether a learner needs review in probability, linear algebra, or signals
+    before beginning.
+  - Build a study schedule that alternates reading, derivation, and worked exercises.
+  - Compare when to use course notes versus outside references for clarification.
+  - Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic
+    entropy.
+  - Derive a simple counting argument for binary strings and compute an event probability.
+  - Compute the entropy of a Bernoulli source and interpret the result.
+  - Compare independent variables with dependent variables using mutual-information
+    reasoning.
+  - Describe when compression succeeds and when it fails on already-random data.
+  - Build a Huffman code for a small source alphabet.
+  - State why reliable transmission above capacity is impossible in the long run.
+  - Contrast uncoded transmission with coded transmission on a noisy channel.
+  - Describe a simple parity-style code and its limits.
+  - Compare a secure scheme with a weak one in terms of revealed information.
+  - Compare the two entropy notions and identify what is preserved across the analogy.
+  - Summarize how reversible computation changes the discussion of dissipation and
+    information loss.
+  - Produce a final study guide that links source coding, channel coding, secrecy,
+    thermodynamic analogies, and computation.


@@ -0,0 +1,803 @@
{
"course_title": "MIT OCW Information and Entropy",
"rights_note": "Derived from MIT OpenCourseWare 6.050J Information and Entropy (Spring 2008). Retain MIT OCW attribution and applicable Creative Commons terms before redistribution.",
"sources": [
{
"source_path": "examples/ocw-information-entropy/course/course-home.md",
"source_type": "markdown",
"title": "Course Home",
"metadata": {}
},
{
"source_path": "examples/ocw-information-entropy/course/syllabus.md",
"source_type": "markdown",
"title": "Syllabus",
"metadata": {}
},
{
"source_path": "examples/ocw-information-entropy/course/unit-sequence.md",
"source_type": "markdown",
"title": "Unit Sequence",
"metadata": {}
}
],
"fragments": [
{
"fragment_id": "imported-from-markdown::mit-ocw-6.050j-information-and-entropy:-course-home::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "MIT OCW 6.050J Information and Entropy: Course Home",
"text": "Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/\nAttribution: adapted from the MIT OpenCourseWare course home page for 6.050J Information and Entropy.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
],
"objectives": [],
"exercises": [],
"key_terms": [
"Information",
"Entropy"
]
},
{
"fragment_id": "imported-from-markdown::information-and-entropy::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Information and Entropy",
"text": "- Objective: Identify the course title, instructors, departments, level, and major topical areas.\n- Exercise: Summarize the course in one paragraph for a prospective learner.\nMIT OpenCourseWare presents 6.050J Information and Entropy as a Spring 2008 undergraduate subject taught by Paul Penfield and Seth Lloyd in Electrical Engineering and Computer Science together with Mechanical Engineering. The catalog framing emphasizes theory of computation, signal processing, and mathematical reasoning about information.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
],
"objectives": [
"Identify the course title, instructors, departments, level, and major topical areas."
],
"exercises": [
"Summarize the course in one paragraph for a prospective learner."
],
"key_terms": [
"Information",
"Entropy",
"Paul",
"Penfield",
"Seth",
"Lloyd",
"Electrical",
"Engineering"
]
},
{
"fragment_id": "imported-from-markdown::information-and-entropy::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Information and Entropy",
"text": "Identify the course title, instructors, departments, level, and major topical areas.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
]
},
{
"fragment_id": "imported-from-markdown::information-and-entropy::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Information and Entropy",
"text": "Summarize the course in one paragraph for a prospective learner.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
]
},
{
"fragment_id": "imported-from-markdown::ultimate-limits-to-communication-and-computation::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Ultimate Limits to Communication and Computation",
"text": "- Objective: Explain the broad intellectual scope of the course.\n- Exercise: List the main topic clusters that connect communication, computation, and entropy.\nThe course examines the ultimate limits to communication and computation with emphasis on the physical nature of information processing. The source description highlights information and computation, digital signals, codes and compression, noise, probability, error correction, reversible and irreversible operations, physics of computation, and quantum computation. Entropy is explicitly connected both to channel capacity and to the second law of thermodynamics.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
],
"objectives": [
"Explain the broad intellectual scope of the course."
],
"exercises": [
"List the main topic clusters that connect communication, computation, and entropy."
],
"key_terms": [
"Entropy"
]
},
{
"fragment_id": "imported-from-markdown::ultimate-limits-to-communication-and-computation::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Ultimate Limits to Communication and Computation",
"text": "Explain the broad intellectual scope of the course.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
]
},
{
"fragment_id": "imported-from-markdown::ultimate-limits-to-communication-and-computation::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Ultimate Limits to Communication and Computation",
"text": "List the main topic clusters that connect communication, computation, and entropy.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
]
},
{
"fragment_id": "imported-from-markdown::open-textbooks,-problem-sets,-and-programming-work::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Open Textbooks, Problem Sets, and Programming Work",
"text": "- Objective: Identify the main kinds of learning resources supplied through the course.\n- Exercise: Explain how these resource types support both conceptual study and practice.\nThe course home lists open textbooks, problem sets, problem set solutions, and programming assignments. A learner using Didactopus should treat these as complementary evidence sources rather than relying on one summary alone.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
],
"objectives": [
"Identify the main kinds of learning resources supplied through the course."
],
"exercises": [
"Explain how these resource types support both conceptual study and practice."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::open-textbooks,-problem-sets,-and-programming-work::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Open Textbooks, Problem Sets, and Programming Work",
"text": "Identify the main kinds of learning resources supplied through the course.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
]
},
{
"fragment_id": "imported-from-markdown::open-textbooks,-problem-sets,-and-programming-work::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Open Textbooks, Problem Sets, and Programming Work",
"text": "Explain how these resource types support both conceptual study and practice.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
]
},
{
"fragment_id": "imported-from-markdown::mit-ocw-6.050j-information-and-entropy:-syllabus::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "MIT OCW 6.050J Information and Entropy: Syllabus",
"text": "Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/\nAttribution: adapted from the MIT OpenCourseWare syllabus page for 6.050J Information and Entropy.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
],
"objectives": [],
"exercises": [],
"key_terms": [
"Information",
"Entropy"
]
},
{
"fragment_id": "imported-from-markdown::prerequisites-and-mathematical-background::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Prerequisites and Mathematical Background",
"text": "- Objective: Explain the mathematical maturity expected by the course.\n- Exercise: Decide whether a learner needs review in probability, linear algebra, or signals before beginning.\nThe syllabus expects a foundation comparable to MIT subjects in calculus and linear algebra, together with comfort in probability, signals, and basic programming. Didactopus should therefore surface prerequisite review when those foundations appear weak.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
],
"objectives": [
"Explain the mathematical maturity expected by the course."
],
"exercises": [
"Decide whether a learner needs review in probability, linear algebra, or signals before beginning."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::prerequisites-and-mathematical-background::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Prerequisites and Mathematical Background",
"text": "Explain the mathematical maturity expected by the course.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::prerequisites-and-mathematical-background::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Prerequisites and Mathematical Background",
"text": "Decide whether a learner needs review in probability, linear algebra, or signals before beginning.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::assessment-structure::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Assessment Structure",
"text": "- Objective: Identify the role of problem sets, exams, and programming work in the course.\n- Exercise: Build a study schedule that alternates reading, derivation, and worked exercises.\nThe syllabus emphasizes regular problem solving and quantitative reasoning. The course is not only a reading list: learners are expected to derive results, solve structured problems, and connect abstract arguments to implementation-oriented tasks.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
],
"objectives": [
"Identify the role of problem sets, exams, and programming work in the course."
],
"exercises": [
"Build a study schedule that alternates reading, derivation, and worked exercises."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::assessment-structure::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Assessment Structure",
"text": "Identify the role of problem sets, exams, and programming work in the course.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::assessment-structure::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Assessment Structure",
"text": "Build a study schedule that alternates reading, derivation, and worked exercises.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::course-notes-and-reference-texts::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Course Notes and Reference Texts",
"text": "- Objective: Explain how the course notes and textbook references supply the core conceptual sequence.\n- Exercise: Compare when to use course notes versus outside references for clarification.\nMIT OCW links course notes and textbook-style resources through the syllabus and resource pages. The intended use is cumulative: earlier notes establish counting, probability, and entropy, while later materials expand into coding, noise, secrecy, thermodynamics, and computation.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
],
"objectives": [
"Explain how the course notes and textbook references supply the core conceptual sequence."
],
"exercises": [
"Compare when to use course notes versus outside references for clarification."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::course-notes-and-reference-texts::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Course Notes and Reference Texts",
"text": "Explain how the course notes and textbook references supply the core conceptual sequence.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::course-notes-and-reference-texts::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Course Notes and Reference Texts",
"text": "Compare when to use course notes versus outside references for clarification.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::independent-reasoning-and-careful-comparison::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Independent Reasoning and Careful Comparison",
"text": "- Objective: Explain why the course requires precise comparison of related but non-identical concepts.\n- Exercise: Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic entropy.\nThe syllabus framing implies a style of work where analogy is useful but dangerous when used loosely. Learners must compare models carefully, state assumptions, and notice where similar mathematics does not imply identical interpretation.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
],
"objectives": [
"Explain why the course requires precise comparison of related but non-identical concepts."
],
"exercises": [
"Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic entropy."
],
"key_terms": [
"Shannon",
"Learners"
]
},
{
"fragment_id": "imported-from-markdown::independent-reasoning-and-careful-comparison::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Independent Reasoning and Careful Comparison",
"text": "Explain why the course requires precise comparison of related but non-identical concepts.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::independent-reasoning-and-careful-comparison::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Independent Reasoning and Careful Comparison",
"text": "Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic entropy.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::mit-ocw-6.050j-information-and-entropy:-unit-sequence::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "MIT OCW 6.050J Information and Entropy: Unit Sequence",
"text": "Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/\nAttribution: adapted from the MIT OpenCourseWare unit progression and resource organization for 6.050J Information and Entropy.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [],
"exercises": [],
"key_terms": [
"Information",
"Entropy"
]
},
{
"fragment_id": "imported-from-markdown::counting-and-probability::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Counting and Probability",
"text": "- Objective: Explain how counting arguments, probability spaces, and random variables support later information-theory results.\n- Exercise: Derive a simple counting argument for binary strings and compute an event probability.\nEarly units establish counting, combinatorics, and probability as the language used to reason about uncertainty, messages, and evidence.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain how counting arguments, probability spaces, and random variables support later information-theory results."
],
"exercises": [
"Derive a simple counting argument for binary strings and compute an event probability."
],
"key_terms": [
"Derive"
]
},
{
"fragment_id": "imported-from-markdown::counting-and-probability::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Counting and Probability",
"text": "Explain how counting arguments, probability spaces, and random variables support later information-theory results.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::counting-and-probability::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Counting and Probability",
"text": "Derive a simple counting argument for binary strings and compute an event probability.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::shannon-entropy::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Shannon Entropy",
"text": "- Objective: Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources.\n- Exercise: Compute the entropy of a Bernoulli source and interpret the result.\nThe course then introduces entropy as a quantitative measure of uncertainty for a source model and uses it to reason about representation cost and surprise.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources."
],
"exercises": [
"Compute the entropy of a Bernoulli source and interpret the result."
],
"key_terms": [
"Shannon",
"Bernoulli"
]
},
{
"fragment_id": "imported-from-markdown::shannon-entropy::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Shannon Entropy",
"text": "Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::shannon-entropy::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Shannon Entropy",
"text": "Compute the entropy of a Bernoulli source and interpret the result.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::mutual-information::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Mutual Information",
"text": "- Objective: Explain mutual information and relate it to dependence between signals or observations.\n- Exercise: Compare independent variables with dependent variables using mutual-information reasoning.\nThese units ask the learner to understand how observation changes uncertainty and what it means for one variable to carry information about another.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain mutual information and relate it to dependence between signals or observations."
],
"exercises": [
"Compare independent variables with dependent variables using mutual-information reasoning."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::mutual-information::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Mutual Information",
"text": "Explain mutual information and relate it to dependence between signals or observations.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::mutual-information::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Mutual Information",
"text": "Compare independent variables with dependent variables using mutual-information reasoning.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::source-coding-and-compression::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Source Coding and Compression",
"text": "- Objective: Explain lossless compression in terms of entropy, redundancy, and coding choices.\n- Exercise: Describe when compression succeeds and when it fails on already-random data.\nThe course develops the idea that structured sources can often be described more efficiently, but only up to limits implied by entropy.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain lossless compression in terms of entropy, redundancy, and coding choices."
],
"exercises": [
"Describe when compression succeeds and when it fails on already-random data."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::source-coding-and-compression::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Source Coding and Compression",
"text": "Explain lossless compression in terms of entropy, redundancy, and coding choices.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::source-coding-and-compression::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Source Coding and Compression",
"text": "Describe when compression succeeds and when it fails on already-random data.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::huffman-coding::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Huffman Coding",
"text": "- Objective: Explain Huffman coding and justify why likely symbols receive shorter descriptions.\n- Exercise: Build a Huffman code for a small source alphabet.\nLearners use trees and expected length arguments to connect probability models to practical code design.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain Huffman coding and justify why likely symbols receive shorter descriptions."
],
"exercises": [
"Build a Huffman code for a small source alphabet."
],
"key_terms": [
"Huffman",
"Learners"
]
},
{
"fragment_id": "imported-from-markdown::huffman-coding::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Huffman Coding",
"text": "Explain Huffman coding and justify why likely symbols receive shorter descriptions.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::huffman-coding::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Huffman Coding",
"text": "Build a Huffman code for a small source alphabet.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::channel-capacity::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Channel Capacity",
"text": "- Objective: Explain channel capacity as a limit on reliable communication over a noisy channel.\n- Exercise: State why reliable transmission above capacity is impossible in the long run.\nThe course treats capacity as a fundamental upper bound and frames noisy communication in terms of rates, inference, and uncertainty reduction.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain channel capacity as a limit on reliable communication over a noisy channel."
],
"exercises": [
"State why reliable transmission above capacity is impossible in the long run."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::channel-capacity::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Channel Capacity",
"text": "Explain channel capacity as a limit on reliable communication over a noisy channel.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::channel-capacity::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Channel Capacity",
"text": "State why reliable transmission above capacity is impossible in the long run.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::channel-coding::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Channel Coding",
"text": "- Objective: Explain how channel coding adds redundancy to protect messages from noise.\n- Exercise: Contrast uncoded transmission with coded transmission on a noisy channel.\nThese units emphasize that redundancy can be wasteful in compression but essential in communication under uncertainty.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain how channel coding adds redundancy to protect messages from noise."
],
"exercises": [
"Contrast uncoded transmission with coded transmission on a noisy channel."
],
"key_terms": [
"Contrast"
]
},
{
"fragment_id": "imported-from-markdown::channel-coding::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Channel Coding",
"text": "Explain how channel coding adds redundancy to protect messages from noise.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::channel-coding::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Channel Coding",
"text": "Contrast uncoded transmission with coded transmission on a noisy channel.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::error-correcting-codes::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Error Correcting Codes",
"text": "- Objective: Explain how error-correcting codes detect or repair corrupted symbols.\n- Exercise: Describe a simple parity-style code and its limits.\nThe learner must connect abstract limits to concrete coding mechanisms and understand both strengths and failure modes.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain how error-correcting codes detect or repair corrupted symbols."
],
"exercises": [
"Describe a simple parity-style code and its limits."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::error-correcting-codes::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Error Correcting Codes",
"text": "Explain how error-correcting codes detect or repair corrupted symbols.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::error-correcting-codes::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Error Correcting Codes",
"text": "Describe a simple parity-style code and its limits.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::cryptography-and-information-hiding::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Cryptography and Information Hiding",
"text": "- Objective: Explain the relationship between secrecy, information leakage, and coded communication.\n- Exercise: Compare a secure scheme with a weak one in terms of revealed information.\nThe course extends information-theoretic reasoning to adversarial settings where controlling what an observer can infer becomes central.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain the relationship between secrecy, information leakage, and coded communication."
],
"exercises": [
"Compare a secure scheme with a weak one in terms of revealed information."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::cryptography-and-information-hiding::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Cryptography and Information Hiding",
"text": "Explain the relationship between secrecy, information leakage, and coded communication.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::cryptography-and-information-hiding::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Cryptography and Information Hiding",
"text": "Compare a secure scheme with a weak one in terms of revealed information.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::thermodynamics-and-entropy::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Thermodynamics and Entropy",
"text": "- Objective: Explain how thermodynamic entropy relates to, and differs from, Shannon entropy.\n- Exercise: Compare the two entropy notions and identify what is preserved across the analogy.\nThe course uses entropy as a bridge concept between communication theory and physics while insisting on careful interpretation.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain how thermodynamic entropy relates to, and differs from, Shannon entropy."
],
"exercises": [
"Compare the two entropy notions and identify what is preserved across the analogy."
],
"key_terms": [
"Shannon"
]
},
{
"fragment_id": "imported-from-markdown::thermodynamics-and-entropy::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Thermodynamics and Entropy",
"text": "Explain how thermodynamic entropy relates to, and differs from, Shannon entropy.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::thermodynamics-and-entropy::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Thermodynamics and Entropy",
"text": "Compare the two entropy notions and identify what is preserved across the analogy.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::reversible-computation-and-quantum-computation::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Reversible Computation and Quantum Computation",
"text": "- Objective: Explain why the physical implementation of computation matters for information processing limits.\n- Exercise: Summarize how reversible computation changes the discussion of dissipation and information loss.\nLater units connect information, entropy, and computation more directly by considering reversible logic, irreversibility, and quantum information themes.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain why the physical implementation of computation matters for information processing limits."
],
"exercises": [
"Summarize how reversible computation changes the discussion of dissipation and information loss."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::reversible-computation-and-quantum-computation::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Reversible Computation and Quantum Computation",
"text": "Explain why the physical implementation of computation matters for information processing limits.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::reversible-computation-and-quantum-computation::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Reversible Computation and Quantum Computation",
"text": "Summarize how reversible computation changes the discussion of dissipation and information loss.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::course-synthesis::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Course Synthesis",
"text": "- Objective: Synthesize the course by connecting entropy, coding, reliability, secrecy, and physical interpretation in one coherent narrative.\n- Exercise: Produce a final study guide that links source coding, channel coding, secrecy, thermodynamic analogies, and computation.\nThe end of the course asks the learner to unify the mathematical and physical perspectives rather than treating the units as disconnected topics.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Synthesize the course by connecting entropy, coding, reliability, secrecy, and physical interpretation in one coherent narrative."
],
"exercises": [
"Produce a final study guide that links source coding, channel coding, secrecy, thermodynamic analogies, and computation."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::course-synthesis::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Course Synthesis",
"text": "Synthesize the course by connecting entropy, coding, reliability, secrecy, and physical interpretation in one coherent narrative.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::course-synthesis::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Course Synthesis",
"text": "Produce a final study guide that links source coding, channel coding, secrecy, thermodynamic analogies, and computation.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
}
]
}

View File

@@ -6,7 +6,20 @@ sources:
     creator: MIT OpenCourseWare
     license_id: CC BY-NC-SA 4.0
     license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
-    retrieved_at: "2026-03-14"
+    retrieved_at: "2026-03-16"
+    adapted: true
+    attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
+    excluded_from_upstream_license: false
+    exclusion_notes: ""
+  - source_id: mit-ocw-6-050j-syllabus
+    title: MIT OpenCourseWare 6.050J Information and Entropy syllabus
+    url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/
+    publisher: Massachusetts Institute of Technology
+    creator: MIT OpenCourseWare
+    license_id: CC BY-NC-SA 4.0
+    license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
+    retrieved_at: "2026-03-16"
     adapted: true
     attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
     excluded_from_upstream_license: false
@@ -19,7 +32,7 @@ sources:
     creator: MIT OpenCourseWare
     license_id: CC BY-NC-SA 4.0
     license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
-    retrieved_at: "2026-03-14"
+    retrieved_at: "2026-03-16"
     adapted: true
     attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
     excluded_from_upstream_license: false
@@ -32,7 +45,7 @@ sources:
     creator: MIT OpenCourseWare
     license_id: CC BY-NC-SA 4.0
    license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
-    retrieved_at: "2026-03-14"
+    retrieved_at: "2026-03-16"
    adapted: true
    attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
    excluded_from_upstream_license: false

View File

@@ -3,9 +3,54 @@
   "domain": "MIT OCW Information and Entropy",
   "artifacts": [
     {
-      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
+      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home",
       "artifact_type": "symbolic",
-      "artifact_name": "mit-ocw-6-050j-information-and-entropy.md"
+      "artifact_name": "mit-ocw-6-050j-information-and-entropy-course-home.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::information-and-entropy",
+      "artifact_type": "symbolic",
+      "artifact_name": "information-and-entropy.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation",
+      "artifact_type": "symbolic",
+      "artifact_name": "ultimate-limits-to-communication-and-computation.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work",
+      "artifact_type": "symbolic",
+      "artifact_name": "open-textbooks-problem-sets-and-programming-work.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus",
+      "artifact_type": "symbolic",
+      "artifact_name": "mit-ocw-6-050j-information-and-entropy-syllabus.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background",
+      "artifact_type": "symbolic",
+      "artifact_name": "prerequisites-and-mathematical-background.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::assessment-structure",
+      "artifact_type": "symbolic",
+      "artifact_name": "assessment-structure.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::course-notes-and-reference-texts",
+      "artifact_type": "symbolic",
+      "artifact_name": "course-notes-and-reference-texts.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison",
+      "artifact_type": "symbolic",
+      "artifact_name": "independent-reasoning-and-careful-comparison.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence",
+      "artifact_type": "symbolic",
+      "artifact_name": "mit-ocw-6-050j-information-and-entropy-unit-sequence.md"
     },
     {
       "concept": "mit-ocw-information-and-entropy::counting-and-probability",
@@ -23,9 +68,9 @@
       "artifact_name": "mutual-information.md"
     },
     {
-      "concept": "mit-ocw-information-and-entropy::data-compression",
+      "concept": "mit-ocw-information-and-entropy::source-coding-and-compression",
       "artifact_type": "symbolic",
-      "artifact_name": "data-compression.md"
+      "artifact_name": "source-coding-and-compression.md"
     },
     {
       "concept": "mit-ocw-information-and-entropy::huffman-coding",

View File

@@ -3,24 +3,42 @@
   "display_name": "OCW Information Entropy Agent",
   "domain": "MIT OCW Information and Entropy",
   "mastered_concepts": [
+    "mit-ocw-information-and-entropy::assessment-structure",
     "mit-ocw-information-and-entropy::channel-capacity",
     "mit-ocw-information-and-entropy::channel-coding",
     "mit-ocw-information-and-entropy::counting-and-probability",
+    "mit-ocw-information-and-entropy::course-notes-and-reference-texts",
     "mit-ocw-information-and-entropy::cryptography-and-information-hiding",
-    "mit-ocw-information-and-entropy::data-compression",
     "mit-ocw-information-and-entropy::error-correcting-codes",
     "mit-ocw-information-and-entropy::huffman-coding",
-    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
+    "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison",
+    "mit-ocw-information-and-entropy::information-and-entropy",
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home",
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus",
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence",
     "mit-ocw-information-and-entropy::mutual-information",
+    "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work",
+    "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background",
     "mit-ocw-information-and-entropy::shannon-entropy",
-    "mit-ocw-information-and-entropy::thermodynamics-and-entropy"
+    "mit-ocw-information-and-entropy::source-coding-and-compression",
+    "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
+    "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation"
   ],
   "weak_dimensions_by_concept": {
-    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy": [],
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home": [],
+    "mit-ocw-information-and-entropy::information-and-entropy": [],
+    "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation": [],
+    "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work": [],
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus": [],
+    "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background": [],
+    "mit-ocw-information-and-entropy::assessment-structure": [],
+    "mit-ocw-information-and-entropy::course-notes-and-reference-texts": [],
+    "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison": [],
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence": [],
     "mit-ocw-information-and-entropy::counting-and-probability": [],
     "mit-ocw-information-and-entropy::shannon-entropy": [],
     "mit-ocw-information-and-entropy::mutual-information": [],
-    "mit-ocw-information-and-entropy::data-compression": [],
+    "mit-ocw-information-and-entropy::source-coding-and-compression": [],
     "mit-ocw-information-and-entropy::huffman-coding": [],
     "mit-ocw-information-and-entropy::channel-capacity": [],
     "mit-ocw-information-and-entropy::channel-coding": [],
@@ -29,7 +47,52 @@
     "mit-ocw-information-and-entropy::thermodynamics-and-entropy": []
   },
   "evaluator_summary_by_concept": {
-    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy": {
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::information-and-entropy": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::assessment-structure": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::course-notes-and-reference-texts": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence": {
       "correctness": 0.8400000000000001,
       "explanation": 0.85,
       "critique": 0.7999999999999999
@@ -49,7 +112,7 @@
       "explanation": 0.85,
       "critique": 0.7999999999999999
     },
-    "mit-ocw-information-and-entropy::data-compression": {
+    "mit-ocw-information-and-entropy::source-coding-and-compression": {
       "correctness": 0.8400000000000001,
       "explanation": 0.85,
       "critique": 0.7999999999999999
@@ -87,9 +150,54 @@
   },
   "artifacts": [
     {
-      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
+      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home",
       "artifact_type": "symbolic",
-      "artifact_name": "mit-ocw-6-050j-information-and-entropy.md"
+      "artifact_name": "mit-ocw-6-050j-information-and-entropy-course-home.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::information-and-entropy",
+      "artifact_type": "symbolic",
+      "artifact_name": "information-and-entropy.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation",
+      "artifact_type": "symbolic",
+      "artifact_name": "ultimate-limits-to-communication-and-computation.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work",
+      "artifact_type": "symbolic",
+      "artifact_name": "open-textbooks-problem-sets-and-programming-work.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus",
+      "artifact_type": "symbolic",
+      "artifact_name": "mit-ocw-6-050j-information-and-entropy-syllabus.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background",
+      "artifact_type": "symbolic",
+      "artifact_name": "prerequisites-and-mathematical-background.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::assessment-structure",
+      "artifact_type": "symbolic",
+      "artifact_name": "assessment-structure.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::course-notes-and-reference-texts",
+      "artifact_type": "symbolic",
+      "artifact_name": "course-notes-and-reference-texts.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison",
+      "artifact_type": "symbolic",
+      "artifact_name": "independent-reasoning-and-careful-comparison.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence",
+      "artifact_type": "symbolic",
+      "artifact_name": "mit-ocw-6-050j-information-and-entropy-unit-sequence.md"
     },
     {
       "concept": "mit-ocw-information-and-entropy::counting-and-probability",
@@ -107,9 +215,9 @@
       "artifact_name": "mutual-information.md"
     },
     {
-      "concept": "mit-ocw-information-and-entropy::data-compression",
+      "concept": "mit-ocw-information-and-entropy::source-coding-and-compression",
       "artifact_type": "symbolic",
-      "artifact_name": "data-compression.md"
+      "artifact_name": "source-coding-and-compression.md"
     },
     {
       "concept": "mit-ocw-information-and-entropy::huffman-coding",

View File

@ -4,19 +4,34 @@
- Domain: `MIT OCW Information and Entropy` - Domain: `MIT OCW Information and Entropy`
## Mastered Concepts ## Mastered Concepts
- mit-ocw-information-and-entropy::assessment-structure
- mit-ocw-information-and-entropy::channel-capacity - mit-ocw-information-and-entropy::channel-capacity
- mit-ocw-information-and-entropy::channel-coding - mit-ocw-information-and-entropy::channel-coding
- mit-ocw-information-and-entropy::counting-and-probability - mit-ocw-information-and-entropy::counting-and-probability
- mit-ocw-information-and-entropy::course-notes-and-reference-texts
- mit-ocw-information-and-entropy::cryptography-and-information-hiding - mit-ocw-information-and-entropy::cryptography-and-information-hiding
- mit-ocw-information-and-entropy::data-compression
- mit-ocw-information-and-entropy::error-correcting-codes - mit-ocw-information-and-entropy::error-correcting-codes
- mit-ocw-information-and-entropy::huffman-coding - mit-ocw-information-and-entropy::huffman-coding
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy - mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison
- mit-ocw-information-and-entropy::information-and-entropy
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence
- mit-ocw-information-and-entropy::mutual-information - mit-ocw-information-and-entropy::mutual-information
- mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work
- mit-ocw-information-and-entropy::prerequisites-and-mathematical-background
- mit-ocw-information-and-entropy::shannon-entropy - mit-ocw-information-and-entropy::shannon-entropy
- mit-ocw-information-and-entropy::source-coding-and-compression
- mit-ocw-information-and-entropy::thermodynamics-and-entropy - mit-ocw-information-and-entropy::thermodynamics-and-entropy
- mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation
## Concept Summaries ## Concept Summaries
### mit-ocw-information-and-entropy::assessment-structure
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::channel-capacity
- correctness: 0.84
- critique: 0.80
@@ -35,13 +50,13 @@
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::course-notes-and-reference-texts
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::cryptography-and-information-hiding
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
@@ -59,7 +74,31 @@
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::information-and-entropy
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
@@ -71,24 +110,57 @@
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::prerequisites-and-mathematical-background
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::shannon-entropy
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::source-coding-and-compression
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::thermodynamics-and-entropy
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
## Artifacts
- mit-ocw-6-050j-information-and-entropy-course-home.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home
- information-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::information-and-entropy
- ultimate-limits-to-communication-and-computation.md (symbolic) for mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation
- open-textbooks-problem-sets-and-programming-work.md (symbolic) for mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work
- mit-ocw-6-050j-information-and-entropy-syllabus.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus
- prerequisites-and-mathematical-background.md (symbolic) for mit-ocw-information-and-entropy::prerequisites-and-mathematical-background
- assessment-structure.md (symbolic) for mit-ocw-information-and-entropy::assessment-structure
- course-notes-and-reference-texts.md (symbolic) for mit-ocw-information-and-entropy::course-notes-and-reference-texts
- independent-reasoning-and-careful-comparison.md (symbolic) for mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison
- mit-ocw-6-050j-information-and-entropy-unit-sequence.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence
- counting-and-probability.md (symbolic) for mit-ocw-information-and-entropy::counting-and-probability
- shannon-entropy.md (symbolic) for mit-ocw-information-and-entropy::shannon-entropy
- mutual-information.md (symbolic) for mit-ocw-information-and-entropy::mutual-information
- source-coding-and-compression.md (symbolic) for mit-ocw-information-and-entropy::source-coding-and-compression
- huffman-coding.md (symbolic) for mit-ocw-information-and-entropy::huffman-coding
- channel-capacity.md (symbolic) for mit-ocw-information-and-entropy::channel-capacity
- channel-coding.md (symbolic) for mit-ocw-information-and-entropy::channel-coding

@@ -1,75 +1,32 @@
{
  "course_source": "examples/ocw-information-entropy/course",
  "source_document_count": 3,
  "pack_dir": "domain-packs/mit-ocw-information-entropy",
  "skill_dir": "skills/ocw-information-entropy-agent",
  "source_inventory": "examples/ocw-information-entropy/sources.yaml",
  "review_flags": [
    "Concept 'MIT OCW 6.050J Information and Entropy: Course Home' has no extracted mastery signals; review manually.",
    "Concept 'MIT OCW 6.050J Information and Entropy: Syllabus' has no extracted mastery signals; review manually.",
    "Concept 'MIT OCW 6.050J Information and Entropy: Unit Sequence' has no extracted mastery signals; review manually."
  ],
  "concept_count": 34,
  "source_fragment_count": 60,
  "target_concept": "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
  "curriculum_path": [
    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home",
    "mit-ocw-information-and-entropy::information-and-entropy",
    "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation",
    "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work",
    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus",
    "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background",
    "mit-ocw-information-and-entropy::assessment-structure",
    "mit-ocw-information-and-entropy::course-notes-and-reference-texts",
    "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison",
    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence",
    "mit-ocw-information-and-entropy::counting-and-probability",
    "mit-ocw-information-and-entropy::shannon-entropy",
    "mit-ocw-information-and-entropy::mutual-information",
    "mit-ocw-information-and-entropy::source-coding-and-compression",
    "mit-ocw-information-and-entropy::huffman-coding",
    "mit-ocw-information-and-entropy::channel-capacity",
    "mit-ocw-information-and-entropy::channel-coding",
@@ -78,25 +35,35 @@
    "mit-ocw-information-and-entropy::thermodynamics-and-entropy"
  ],
  "mastered_concepts": [
    "mit-ocw-information-and-entropy::assessment-structure",
    "mit-ocw-information-and-entropy::channel-capacity",
    "mit-ocw-information-and-entropy::channel-coding",
    "mit-ocw-information-and-entropy::counting-and-probability",
    "mit-ocw-information-and-entropy::course-notes-and-reference-texts",
    "mit-ocw-information-and-entropy::cryptography-and-information-hiding",
    "mit-ocw-information-and-entropy::error-correcting-codes",
    "mit-ocw-information-and-entropy::huffman-coding",
    "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison",
    "mit-ocw-information-and-entropy::information-and-entropy",
    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home",
    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus",
    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence",
    "mit-ocw-information-and-entropy::mutual-information",
    "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work",
    "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background",
    "mit-ocw-information-and-entropy::shannon-entropy",
    "mit-ocw-information-and-entropy::source-coding-and-compression",
    "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
    "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation"
  ],
  "artifact_count": 20,
  "compliance_manifest": "domain-packs/mit-ocw-information-entropy/pack_compliance_manifest.json",
  "compliance": {
    "pack_id": "mit-ocw-information-and-entropy",
    "display_name": "MIT OCW Information and Entropy",
    "derived_from_sources": [
      "mit-ocw-6-050j-course-home",
      "mit-ocw-6-050j-syllabus",
      "mit-ocw-6-050j-unit-8-textbook",
      "mit-ocw-6-050j-unit-13-textbook"
    ],

@@ -0,0 +1,25 @@
# MIT OCW 6.050J Information and Entropy: Course Home
Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/
Attribution: adapted from the MIT OpenCourseWare course home page for 6.050J Information and Entropy.
## Course Identity
### Information and Entropy
- Objective: Identify the course title, instructors, departments, level, and major topical areas.
- Exercise: Summarize the course in one paragraph for a prospective learner.
MIT OpenCourseWare presents 6.050J Information and Entropy as a Spring 2008 undergraduate subject taught by Paul Penfield and Seth Lloyd in Electrical Engineering and Computer Science together with Mechanical Engineering. The catalog framing emphasizes theory of computation, signal processing, and mathematical reasoning about information.
## Course Description
### Ultimate Limits to Communication and Computation
- Objective: Explain the broad intellectual scope of the course.
- Exercise: List the main topic clusters that connect communication, computation, and entropy.
The course examines the ultimate limits to communication and computation with emphasis on the physical nature of information processing. The source description highlights information and computation, digital signals, codes and compression, noise, probability, error correction, reversible and irreversible operations, physics of computation, and quantum computation. Entropy is explicitly connected both to channel capacity and to the second law of thermodynamics.
## Resource Types
### Open Textbooks, Problem Sets, and Programming Work
- Objective: Identify the main kinds of learning resources supplied through the course.
- Exercise: Explain how these resource types support both conceptual study and practice.
The course home lists open textbooks, problem sets, problem set solutions, and programming assignments. A learner using Didactopus should treat these as complementary evidence sources rather than relying on one summary alone.

@@ -0,0 +1,30 @@
# MIT OCW 6.050J Information and Entropy: Syllabus
Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/
Attribution: adapted from the MIT OpenCourseWare syllabus page for 6.050J Information and Entropy.
## Course Logistics
### Prerequisites and Mathematical Background
- Objective: Explain the mathematical maturity expected by the course.
- Exercise: Decide whether a learner needs review in probability, linear algebra, or signals before beginning.
The syllabus expects a foundation comparable to MIT subjects in calculus and linear algebra, together with comfort in probability, signals, and basic programming. Didactopus should therefore surface prerequisite review when those foundations appear weak.
### Assessment Structure
- Objective: Identify the role of problem sets, exams, and programming work in the course.
- Exercise: Build a study schedule that alternates reading, derivation, and worked exercises.
The syllabus emphasizes regular problem solving and quantitative reasoning. The course is not only a reading list: learners are expected to derive results, solve structured problems, and connect abstract arguments to implementation-oriented tasks.
## Reading Base
### Course Notes and Reference Texts
- Objective: Explain how the course notes and textbook references supply the core conceptual sequence.
- Exercise: Compare when to use course notes versus outside references for clarification.
MIT OCW links course notes and textbook-style resources through the syllabus and resource pages. The intended use is cumulative: earlier notes establish counting, probability, and entropy, while later materials expand into coding, noise, secrecy, thermodynamics, and computation.
## Learning Norms
### Independent Reasoning and Careful Comparison
- Objective: Explain why the course requires precise comparison of related but non-identical concepts.
- Exercise: Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic entropy.
The syllabus framing implies a style of work where analogy is useful but dangerous when used loosely. Learners must compare models carefully, state assumptions, and notice where similar mathematics does not imply identical interpretation.

@@ -0,0 +1,72 @@
# MIT OCW 6.050J Information and Entropy: Unit Sequence
Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/
Attribution: adapted from the MIT OpenCourseWare unit progression and resource organization for 6.050J Information and Entropy.
## Foundations
### Counting and Probability
- Objective: Explain how counting arguments, probability spaces, and random variables support later information-theory results.
- Exercise: Derive a simple counting argument for binary strings and compute an event probability.
Early units establish counting, combinatorics, and probability as the language used to reason about uncertainty, messages, and evidence.
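A quick numerical check of the counting exercise (an illustrative sketch, not part of the OCW materials; the choice of n = 4 and the "exactly two 1s" event are arbitrary examples):

```python
from itertools import product

# There are 2**n binary strings of length n; enumerate and confirm for n = 4.
n = 4
strings = list(product("01", repeat=n))
assert len(strings) == 2 ** n  # counting argument: 2 choices per position

# Event probability under a uniform model: P(exactly two 1s in a length-4 string).
favorable = sum(1 for s in strings if s.count("1") == 2)
probability = favorable / len(strings)
print(probability)  # 6/16 = 0.375, since C(4, 2) = 6 of the 16 strings qualify
```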
### Shannon Entropy
- Objective: Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources.
- Exercise: Compute the entropy of a Bernoulli source and interpret the result.
The course then introduces entropy as a quantitative measure of uncertainty for a source model and uses it to reason about representation cost and surprise: high-entropy sources are harder to predict and cost more bits to represent.
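The Bernoulli exercise can be sketched in a few lines; the function name and the chosen bias values are illustrative, not taken from the course:

```python
from math import log2

def bernoulli_entropy(p: float) -> float:
    """Shannon entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no uncertainty
    return -p * log2(p) - (1 - p) * log2(1 - p)

print(bernoulli_entropy(0.5))  # 1.0 bit: a fair coin is maximally uncertain
print(bernoulli_entropy(0.9))  # ~0.469 bits: a biased coin surprises less
```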
### Mutual Information
- Objective: Explain mutual information and relate it to dependence between signals or observations.
- Exercise: Compare independent variables with dependent variables using mutual-information reasoning.
These units ask the learner to understand how observation changes uncertainty and what it means for one variable to carry information about another.
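One way to make the independent-versus-dependent comparison concrete (an illustrative sketch; the joint tables are invented toy distributions):

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) = sum over x,y of p(x,y) * log2[ p(x,y) / (p(x)p(y)) ], in bits."""
    px = [sum(row) for row in joint]            # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]      # marginal of Y (column sums)
    return sum(
        p * log2(p / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, p in enumerate(row)
        if p > 0
    )

independent = [[0.25, 0.25], [0.25, 0.25]]    # Y tells us nothing about X
perfectly_coupled = [[0.5, 0.0], [0.0, 0.5]]  # Y determines X exactly

print(mutual_information(independent))        # 0.0 bits
print(mutual_information(perfectly_coupled))  # 1.0 bit
```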
## Coding and Compression
### Source Coding and Compression
- Objective: Explain lossless compression in terms of entropy, redundancy, and coding choices.
- Exercise: Describe when compression succeeds and when it fails on already-random data.
The course develops the idea that structured sources can often be described more efficiently, but only up to limits implied by entropy.
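A small experiment using only the standard-library `zlib` makes the contrast visible; the byte patterns are arbitrary examples, not course data:

```python
import os
import zlib

structured = b"ABAB" * 4096        # a highly redundant source, 16 KiB
random_like = os.urandom(16384)    # a near-maximal-entropy source, 16 KiB

structured_size = len(zlib.compress(structured, level=9))
random_size = len(zlib.compress(random_like, level=9))

print(len(structured), "->", structured_size)   # collapses to a few dozen bytes
print(len(random_like), "->", random_size)      # barely shrinks, and may even grow
```

The redundant source compresses dramatically; the random bytes sit near the entropy limit and resist compression, exactly as the unit predicts.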
### Huffman Coding
- Objective: Explain Huffman coding and justify why likely symbols receive shorter descriptions.
- Exercise: Build a Huffman code for a small source alphabet.
Learners use trees and expected length arguments to connect probability models to practical code design.
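A minimal Huffman construction along these lines (an illustrative sketch; the dyadic weights are chosen so code lengths exactly match symbol probabilities):

```python
import heapq

def huffman_code(freqs):
    """Build a prefix code; frequent symbols get shorter codewords."""
    # Heap entries: (weight, tiebreak, list of (symbol, partial codeword) pairs).
    heap = [(w, i, [(sym, "")]) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)    # two least-likely subtrees merge,
        w2, _, right = heapq.heappop(heap)   # pushing their symbols one level deeper
        merged = [(s, "0" + c) for s, c in left] + [(s, "1" + c) for s, c in right]
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return dict(heap[0][2])

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)  # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
```

The expected codeword length here is 1.75 bits, which equals the source entropy because the probabilities are powers of two.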
## Communication Under Noise
### Channel Capacity
- Objective: Explain channel capacity as a limit on reliable communication over a noisy channel.
- Exercise: State why reliable transmission above capacity is impossible in the long run.
The course treats capacity as a fundamental upper bound and frames noisy communication in terms of rates, inference, and uncertainty reduction.
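For the binary symmetric channel in particular, capacity has the closed form C = 1 - H(p); the following sketch assumes that channel model, which is a standard example rather than a quotation from the course page:

```python
from math import log2

def binary_entropy(p: float) -> float:
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(flip_prob: float) -> float:
    """Capacity of a binary symmetric channel: C = 1 - H(p), bits per channel use."""
    return 1.0 - binary_entropy(flip_prob)

print(bsc_capacity(0.0))   # 1.0: a noiseless binary channel
print(bsc_capacity(0.11))  # ~0.5: roughly half the raw bit rate is usable
print(bsc_capacity(0.5))   # 0.0: the output is independent of the input
```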
### Channel Coding
- Objective: Explain how channel coding adds redundancy to protect messages from noise.
- Exercise: Contrast uncoded transmission with coded transmission on a noisy channel.
These units emphasize that redundancy can be wasteful in compression but essential in communication under uncertainty.
### Error Correcting Codes
- Objective: Explain how error-correcting codes detect or repair corrupted symbols.
- Exercise: Describe a simple parity-style code and its limits.
The learner must connect abstract limits to concrete coding mechanisms and understand both strengths and failure modes.
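A single even-parity bit illustrates both the mechanism and its limits; this toy sketch is not drawn from the course materials:

```python
def add_parity(bits):
    """Append an even-parity bit so any single flipped bit is detectable."""
    return bits + [sum(bits) % 2]

def parity_ok(word):
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])
print(parity_ok(word))  # True: transmission was clean

word[2] ^= 1            # flip one bit: the error is detected...
print(parity_ok(word))  # False

word[0] ^= 1            # ...but flip a second bit and it goes unseen
print(parity_ok(word))  # True: the limit of a single parity check
```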
## Broader Applications
### Cryptography and Information Hiding
- Objective: Explain the relationship between secrecy, information leakage, and coded communication.
- Exercise: Compare a secure scheme with a weak one in terms of revealed information.
The course extends information-theoretic reasoning to adversarial settings where controlling what an observer can infer becomes central.
### Thermodynamics and Entropy
- Objective: Explain how thermodynamic entropy relates to, and differs from, Shannon entropy.
- Exercise: Compare the two entropy notions and identify what is preserved across the analogy.
The course uses entropy as a bridge concept between communication theory and physics while insisting on careful interpretation.
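For the comparison exercise, the two standard definitions can be set side by side (textbook forms, not quoted from the OCW page):

```latex
% Shannon entropy of a discrete source, measured in bits:
H(X) = -\sum_{i} p_i \log_2 p_i

% Thermodynamic (Boltzmann) entropy of a system with \Omega microstates, in J/K:
S = k_B \ln \Omega

% Both are logarithmic measures of uncertainty over states; the base of the
% logarithm, the units, and the physical interpretation differ.
```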
### Reversible Computation and Quantum Computation
- Objective: Explain why the physical implementation of computation matters for information processing limits.
- Exercise: Summarize how reversible computation changes the discussion of dissipation and information loss.
Later units connect information, entropy, and computation more directly by considering reversible logic, irreversibility, and quantum information themes.
### Course Synthesis
- Objective: Synthesize the course by connecting entropy, coding, reliability, secrecy, and physical interpretation in one coherent narrative.
- Exercise: Produce a final study guide that links source coding, channel coding, secrecy, thermodynamic analogies, and computation.
The end of the course asks the learner to unify the mathematical and physical perspectives rather than treating the units as disconnected topics.

@@ -6,7 +6,20 @@ sources:
    creator: MIT OpenCourseWare
    license_id: CC BY-NC-SA 4.0
    license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
    retrieved_at: "2026-03-16"
    adapted: true
    attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
    excluded_from_upstream_license: false
    exclusion_notes: ""
  - source_id: mit-ocw-6-050j-syllabus
    title: MIT OpenCourseWare 6.050J Information and Entropy syllabus
    url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/
    publisher: Massachusetts Institute of Technology
    creator: MIT OpenCourseWare
    license_id: CC BY-NC-SA 4.0
    license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
    retrieved_at: "2026-03-16"
    adapted: true
    attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
    excluded_from_upstream_license: false
@@ -19,7 +32,7 @@ sources:
    creator: MIT OpenCourseWare
    license_id: CC BY-NC-SA 4.0
    license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
    retrieved_at: "2026-03-16"
    adapted: true
    attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
    excluded_from_upstream_license: false
@@ -32,7 +45,7 @@ sources:
    creator: MIT OpenCourseWare
    license_id: CC BY-NC-SA 4.0
    license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
    retrieved_at: "2026-03-16"
    adapted: true
    attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
    excluded_from_upstream_license: false

@@ -12,8 +12,9 @@ Use this skill when the task is about tutoring, evaluating, or planning study in
1. Read `references/generated-course-summary.md` for the pack structure and target concepts.
2. Read `references/generated-capability-summary.md` to understand what the demo AI learner already mastered.
3. Use `assets/generated/pack/` as the source of truth for concept ids, prerequisites, and mastery signals.
4. Use `assets/generated/pack/source_corpus.json` to ground explanations in the ingested source material before relying on model prior knowledge.
5. When giving guidance, preserve the pack ordering from fundamentals through coding and thermodynamics.
6. When uncertain, say which concept or prerequisite in the generated pack is underspecified and which source fragment would need review.

## Outputs

@@ -1,54 +1,203 @@
concepts:
- id: mit-ocw-6-050j-information-and-entropy-course-home
  title: 'MIT OCW 6.050J Information and Entropy: Course Home'
  description: 'Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/
    Attribution: adapted from the MIT OpenCourseWare course home page for 6.050J Information
    and Entropy.'
  prerequisites: []
  mastery_signals: []
  mastery_profile: {}
- id: information-and-entropy
  title: Information and Entropy
  description: '- Objective: Identify the course title, instructors, departments,
    level, and major topical areas.
    - Exercise: Summarize the course in one paragraph for a prospective learner.
    MIT OpenCourseWare presents 6.050J Information and Entropy as a S'
  prerequisites:
  - mit-ocw-6-050j-information-and-entropy-course-home
  mastery_signals:
  - Identify the course title, instructors, departments, level, and major topical
    areas.
  mastery_profile: {}
- id: paul
  title: Paul
  description: Candidate concept extracted from lesson 'Information and Entropy'.
  prerequisites: []
  mastery_signals:
  - Identify the course title, instructors, departments, level, and major topical
    areas.
  mastery_profile: {}
- id: penfield
  title: Penfield
  description: Candidate concept extracted from lesson 'Information and Entropy'.
  prerequisites: []
  mastery_signals:
  - Identify the course title, instructors, departments, level, and major topical
    areas.
  mastery_profile: {}
- id: seth
  title: Seth
  description: Candidate concept extracted from lesson 'Information and Entropy'.
  prerequisites: []
  mastery_signals:
  - Identify the course title, instructors, departments, level, and major topical
    areas.
  mastery_profile: {}
- id: lloyd
  title: Lloyd
  description: Candidate concept extracted from lesson 'Information and Entropy'.
  prerequisites: []
  mastery_signals:
  - Identify the course title, instructors, departments, level, and major topical
    areas.
  mastery_profile: {}
- id: electrical
  title: Electrical
  description: Candidate concept extracted from lesson 'Information and Entropy'.
  prerequisites: []
  mastery_signals:
  - Identify the course title, instructors, departments, level, and major topical
    areas.
  mastery_profile: {}
- id: engineering
  title: Engineering
  description: Candidate concept extracted from lesson 'Information and Entropy'.
  prerequisites: []
  mastery_signals:
  - Identify the course title, instructors, departments, level, and major topical
    areas.
  mastery_profile: {}
- id: ultimate-limits-to-communication-and-computation
  title: Ultimate Limits to Communication and Computation
  description: '- Objective: Explain the broad intellectual scope of the course.
    - Exercise: List the main topic clusters that connect communication, computation,
    and entropy.
    The course examines the ultimate limits to communication and computation with
    em'
  prerequisites:
  - information-and-entropy
  mastery_signals:
  - Explain the broad intellectual scope of the course.
  mastery_profile: {}
- id: entropy
  title: Entropy
  description: Candidate concept extracted from lesson 'Ultimate Limits to Communication
    and Computation'.
  prerequisites: []
  mastery_signals:
  - Explain the broad intellectual scope of the course.
  mastery_profile: {}
- id: open-textbooks-problem-sets-and-programming-work
  title: Open Textbooks, Problem Sets, and Programming Work
  description: '- Objective: Identify the main kinds of learning resources supplied
    through the course.
    - Exercise: Explain how these resource types support both conceptual study and
    practice.
    The course home lists open textbooks, problem sets, problem set'
  prerequisites:
  - ultimate-limits-to-communication-and-computation
  mastery_signals:
  - Identify the main kinds of learning resources supplied through the course.
  mastery_profile: {}
- id: mit-ocw-6-050j-information-and-entropy-syllabus
  title: 'MIT OCW 6.050J Information and Entropy: Syllabus'
  description: 'Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/
    Attribution: adapted from the MIT OpenCourseWare syllabus page for 6.050J Information
and Entropy.'
prerequisites:
- open-textbooks-problem-sets-and-programming-work
mastery_signals: [] mastery_signals: []
mastery_profile: {} mastery_profile: {}
- id: source - id: prerequisites-and-mathematical-background
title: Source title: Prerequisites and Mathematical Background
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information description: '- Objective: Explain the mathematical maturity expected by the course.
and Entropy'.
prerequisites: [] - Exercise: Decide whether a learner needs review in probability, linear algebra,
mastery_signals: [] or signals before beginning.
The syllabus expects a foundation comparable to MIT subjec'
prerequisites:
- mit-ocw-6-050j-information-and-entropy-syllabus
mastery_signals:
- Explain the mathematical maturity expected by the course.
mastery_profile: {} mastery_profile: {}
- id: opencourseware - id: assessment-structure
title: OpenCourseWare title: Assessment Structure
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information description: '- Objective: Identify the role of problem sets, exams, and programming
and Entropy'. work in the course.
prerequisites: []
mastery_signals: [] - Exercise: Build a study schedule that alternates reading, derivation, and worked
exercises.
The syllabus emphasizes regular problem solving and qua'
prerequisites:
- prerequisites-and-mathematical-background
mastery_signals:
- Identify the role of problem sets, exams, and programming work in the course.
mastery_profile: {} mastery_profile: {}
- id: spring - id: course-notes-and-reference-texts
title: Spring title: Course Notes and Reference Texts
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information description: '- Objective: Explain how the course notes and textbook references
and Entropy'. supply the core conceptual sequence.
prerequisites: []
mastery_signals: [] - Exercise: Compare when to use course notes versus outside references for clarification.
MIT OCW links course notes and textbook-style r'
prerequisites:
- assessment-structure
mastery_signals:
- Explain how the course notes and textbook references supply the core conceptual
sequence.
mastery_profile: {} mastery_profile: {}
- id: attribution - id: independent-reasoning-and-careful-comparison
title: Attribution title: Independent Reasoning and Careful Comparison
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information description: '- Objective: Explain why the course requires precise comparison of
and Entropy'. related but non-identical concepts.
- Exercise: Write a short note distinguishing Shannon entropy, channel capacity,
and thermodynamic entropy.
The syllabus framing implies'
prerequisites:
- course-notes-and-reference-texts
mastery_signals:
- Explain why the course requires precise comparison of related but non-identical
concepts.
mastery_profile: {}
- id: shannon
title: Shannon
description: Candidate concept extracted from lesson 'Independent Reasoning and
Careful Comparison'.
prerequisites: [] prerequisites: []
mastery_signals:
- Explain why the course requires precise comparison of related but non-identical
concepts.
mastery_profile: {}
- id: learners
title: Learners
description: Candidate concept extracted from lesson 'Independent Reasoning and
Careful Comparison'.
prerequisites: []
mastery_signals:
- Explain why the course requires precise comparison of related but non-identical
concepts.
mastery_profile: {}
- id: mit-ocw-6-050j-information-and-entropy-unit-sequence
title: 'MIT OCW 6.050J Information and Entropy: Unit Sequence'
description: 'Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/
Attribution: adapted from the MIT OpenCourseWare unit progression and resource
organization for 6.050J Information and Entropy.'
prerequisites:
- independent-reasoning-and-careful-comparison
mastery_signals: [] mastery_signals: []
mastery_profile: {} mastery_profile: {}
- id: counting-and-probability - id: counting-and-probability
@@ -59,284 +208,132 @@ concepts:
     - Exercise: Derive a simple counting argument for binary strings and compute an
     event probability.
-    This lesson i'
+    Early units e'
   prerequisites:
-  - mit-ocw-6-050j-information-and-entropy
-  mastery_signals: []
-  mastery_profile: {}
-- id: counting
-  title: Counting
-  description: Candidate concept extracted from lesson 'Counting and Probability'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: probability
-  title: Probability
-  description: Candidate concept extracted from lesson 'Counting and Probability'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: objective
-  title: Objective
-  description: Candidate concept extracted from lesson 'Counting and Probability'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: explain
-  title: Explain
-  description: Candidate concept extracted from lesson 'Counting and Probability'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: exercise
-  title: Exercise
-  description: Candidate concept extracted from lesson 'Counting and Probability'.
-  prerequisites: []
-  mastery_signals: []
+  - mit-ocw-6-050j-information-and-entropy-unit-sequence
+  mastery_signals:
+  - Explain how counting arguments, probability spaces, and random variables support
+    later information-theory results.
   mastery_profile: {}
 - id: derive
   title: Derive
   description: Candidate concept extracted from lesson 'Counting and Probability'.
   prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: this
-  title: This
-  description: Candidate concept extracted from lesson 'Counting and Probability'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: random
-  title: Random
-  description: Candidate concept extracted from lesson 'Counting and Probability'.
-  prerequisites: []
-  mastery_signals: []
+  mastery_signals:
+  - Explain how counting arguments, probability spaces, and random variables support
+    later information-theory results.
   mastery_profile: {}
 - id: shannon-entropy
   title: Shannon Entropy
-  description: '- Objective: Explain Shannon Entropy as a measure of uncertainty and
+  description: '- Objective: Explain Shannon entropy as a measure of uncertainty and
     compare high-entropy and low-entropy sources.
     - Exercise: Compute the entropy of a Bernoulli source and interpret the result.
-    This lesson centers Shannon Entropy, Surprise'
+    The course then introduces entropy as a quant'
   prerequisites:
   - counting-and-probability
-  mastery_signals: []
-  mastery_profile: {}
-- id: shannon
-  title: Shannon
-  description: Candidate concept extracted from lesson 'Shannon Entropy'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: compute
-  title: Compute
-  description: Candidate concept extracted from lesson 'Shannon Entropy'.
-  prerequisites: []
-  mastery_signals: []
+  mastery_signals:
+  - Explain Shannon entropy as a measure of uncertainty and compare high-entropy and
+    low-entropy sources.
   mastery_profile: {}
 - id: bernoulli
   title: Bernoulli
   description: Candidate concept extracted from lesson 'Shannon Entropy'.
   prerequisites: []
-  mastery_signals: []
+  mastery_signals:
+  - Explain Shannon entropy as a measure of uncertainty and compare high-entropy and
+    low-entropy sources.
   mastery_profile: {}
 - id: mutual-information
   title: Mutual Information
-  description: '- Objective: Explain Mutual Information and relate it to dependence
-    between signals.
+  description: '- Objective: Explain mutual information and relate it to dependence
+    between signals or observations.
     - Exercise: Compare independent variables with dependent variables using mutual-information
     reasoning.
-    This lesson introduces Mutual Information, Dependenc'
+    These units ask the learner to under'
   prerequisites:
   - shannon-entropy
-  mastery_signals: []
+  mastery_signals:
+  - Explain mutual information and relate it to dependence between signals or observations.
   mastery_profile: {}
-- id: mutual
-  title: Mutual
-  description: Candidate concept extracted from lesson 'Mutual Information'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: compare
-  title: Compare
-  description: Candidate concept extracted from lesson 'Mutual Information'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: dependence
-  title: Dependence
-  description: Candidate concept extracted from lesson 'Mutual Information'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: data-compression
-  title: Data Compression
-  description: '- Objective: Explain lossless compression in terms of entropy and
-    typical structure.
+- id: source-coding-and-compression
+  title: Source Coding and Compression
+  description: '- Objective: Explain lossless compression in terms of entropy, redundancy,
+    and coding choices.
     - Exercise: Describe when compression succeeds and when it fails on already-random
     data.
-    This lesson covers Data Compression, Redundancy, and Efficient Rep'
+    The course develops the idea that structured sources can'
   prerequisites:
   - mutual-information
-  mastery_signals: []
-  mastery_profile: {}
-- id: data
-  title: Data
-  description: Candidate concept extracted from lesson 'Data Compression'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: compression
-  title: Compression
-  description: Candidate concept extracted from lesson 'Data Compression'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: describe
-  title: Describe
-  description: Candidate concept extracted from lesson 'Data Compression'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: redundancy
-  title: Redundancy
-  description: Candidate concept extracted from lesson 'Data Compression'.
-  prerequisites: []
-  mastery_signals: []
+  mastery_signals:
+  - Explain lossless compression in terms of entropy, redundancy, and coding choices.
   mastery_profile: {}
 - id: huffman-coding
   title: Huffman Coding
-  description: '- Objective: Explain Huffman Coding and justify why shorter codewords
-    should track more likely symbols.
+  description: '- Objective: Explain Huffman coding and justify why likely symbols
+    receive shorter descriptions.
     - Exercise: Build a Huffman code for a small source alphabet.
-    This lesson focuses on Huffman Coding, Prefix Codes, and Expected Length.'
+    Learners use trees and expected length arguments to connect probability models
+    to'
   prerequisites:
-  - data-compression
-  mastery_signals: []
-  mastery_profile: {}
-- id: huffman
-  title: Huffman
-  description: Candidate concept extracted from lesson 'Huffman Coding'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: coding
-  title: Coding
-  description: Candidate concept extracted from lesson 'Huffman Coding'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: build
-  title: Build
-  description: Candidate concept extracted from lesson 'Huffman Coding'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: prefix
-  title: Prefix
-  description: Candidate concept extracted from lesson 'Huffman Coding'.
-  prerequisites: []
-  mastery_signals: []
+  - source-coding-and-compression
+  mastery_signals:
+  - Explain Huffman coding and justify why likely symbols receive shorter descriptions.
   mastery_profile: {}
 - id: channel-capacity
   title: Channel Capacity
-  description: '- Objective: Explain Channel Capacity as a limit on reliable communication
-    over noisy channels.
+  description: '- Objective: Explain channel capacity as a limit on reliable communication
+    over a noisy channel.
     - Exercise: State why reliable transmission above capacity is impossible in the
     long run.
-    This lesson develops Channel Capacity, Reliable Commun'
+    The course treats capacity as a fundamental upper bou'
   prerequisites:
   - huffman-coding
-  mastery_signals: []
-  mastery_profile: {}
-- id: channel
-  title: Channel
-  description: Candidate concept extracted from lesson 'Channel Capacity'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: capacity
-  title: Capacity
-  description: Candidate concept extracted from lesson 'Channel Capacity'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: state
-  title: State
-  description: Candidate concept extracted from lesson 'Channel Capacity'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: reliable
-  title: Reliable
-  description: Candidate concept extracted from lesson 'Channel Capacity'.
-  prerequisites: []
-  mastery_signals: []
+  mastery_signals:
+  - Explain channel capacity as a limit on reliable communication over a noisy channel.
   mastery_profile: {}
 - id: channel-coding
   title: Channel Coding
-  description: '- Objective: Explain how Channel Coding adds structure that protects
-    messages against noise.
+  description: '- Objective: Explain how channel coding adds redundancy to protect
+    messages from noise.
     - Exercise: Contrast uncoded transmission with coded transmission on a noisy channel.
-    This lesson connects Channel Coding, Decoding, and Reliabilit'
+    These units emphasize that redundancy can be wasteful in compressi'
   prerequisites:
   - channel-capacity
-  mastery_signals: []
+  mastery_signals:
+  - Explain how channel coding adds redundancy to protect messages from noise.
   mastery_profile: {}
 - id: contrast
   title: Contrast
   description: Candidate concept extracted from lesson 'Channel Coding'.
   prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: decoding
-  title: Decoding
-  description: Candidate concept extracted from lesson 'Channel Coding'.
-  prerequisites: []
-  mastery_signals: []
+  mastery_signals:
+  - Explain how channel coding adds redundancy to protect messages from noise.
   mastery_profile: {}
 - id: error-correcting-codes
   title: Error Correcting Codes
-  description: '- Objective: Explain how Error Correcting Codes detect or correct
-    symbol corruption.
+  description: '- Objective: Explain how error-correcting codes detect or repair corrupted
+    symbols.
     - Exercise: Describe a simple parity-style code and its limits.
-    This lesson covers Error Correcting Codes, Parity, and Syndrome-style reasoning.
-    The learne'
+    The learner must connect abstract limits to concrete coding mechanisms and understand
+    both s'
   prerequisites:
   - channel-coding
-  mastery_signals: []
-  mastery_profile: {}
-- id: error
-  title: Error
-  description: Candidate concept extracted from lesson 'Error Correcting Codes'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: correcting
-  title: Correcting
-  description: Candidate concept extracted from lesson 'Error Correcting Codes'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: codes
-  title: Codes
-  description: Candidate concept extracted from lesson 'Error Correcting Codes'.
-  prerequisites: []
-  mastery_signals: []
+  mastery_signals:
+  - Explain how error-correcting codes detect or repair corrupted symbols.
   mastery_profile: {}
 - id: cryptography-and-information-hiding
   title: Cryptography and Information Hiding
@@ -345,24 +342,11 @@ concepts:
     - Exercise: Compare a secure scheme with a weak one in terms of revealed information.
-    This lesson combines Cryptography, Information Leakag'
+    The course extends information-theoretic reasoning to'
   prerequisites:
   - error-correcting-codes
-  mastery_signals: []
-  mastery_profile: {}
-- id: cryptography
-  title: Cryptography
-  description: Candidate concept extracted from lesson 'Cryptography and Information
-    Hiding'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: hiding
-  title: Hiding
-  description: Candidate concept extracted from lesson 'Cryptography and Information
-    Hiding'.
-  prerequisites: []
-  mastery_signals: []
+  mastery_signals:
+  - Explain the relationship between secrecy, information leakage, and coded communication.
   mastery_profile: {}
 - id: thermodynamics-and-entropy
   title: Thermodynamics and Entropy
@@ -372,49 +356,37 @@ concepts:
     - Exercise: Compare the two entropy notions and identify what is preserved across
     the analogy.
-    This lesson connects Thermodynamics, Entropy, and P'
+    The course uses entropy as a bridge concept between'
   prerequisites:
   - cryptography-and-information-hiding
-  mastery_signals: []
+  mastery_signals:
+  - Explain how thermodynamic entropy relates to, and differs from, Shannon entropy.
   mastery_profile: {}
-- id: thermodynamics
-  title: Thermodynamics
-  description: Candidate concept extracted from lesson 'Thermodynamics and Entropy'.
-  prerequisites: []
-  mastery_signals: []
+- id: reversible-computation-and-quantum-computation
+  title: Reversible Computation and Quantum Computation
+  description: '- Objective: Explain why the physical implementation of computation
+    matters for information processing limits.
+    - Exercise: Summarize how reversible computation changes the discussion of dissipation
+    and information loss.
+    Later units connect'
+  prerequisites:
+  - thermodynamics-and-entropy
+  mastery_signals:
+  - Explain why the physical implementation of computation matters for information
+    processing limits.
   mastery_profile: {}
 - id: course-synthesis
   title: Course Synthesis
   description: '- Objective: Synthesize the course by connecting entropy, coding,
-    reliability, and physical interpretation in one coherent narrative.
+    reliability, secrecy, and physical interpretation in one coherent narrative.
     - Exercise: Produce a final study guide that links source coding, channel coding,
-    secrecy, and thermodynam'
+    secrecy, thermo'
   prerequisites:
-  - thermodynamics-and-entropy
-  mastery_signals: []
-  mastery_profile: {}
-- id: course
-  title: Course
-  description: Candidate concept extracted from lesson 'Course Synthesis'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: synthesis
-  title: Synthesis
-  description: Candidate concept extracted from lesson 'Course Synthesis'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: synthesize
-  title: Synthesize
-  description: Candidate concept extracted from lesson 'Course Synthesis'.
-  prerequisites: []
-  mastery_signals: []
-  mastery_profile: {}
-- id: produce
-  title: Produce
-  description: Candidate concept extracted from lesson 'Course Synthesis'.
-  prerequisites: []
-  mastery_signals: []
+  - reversible-computation-and-quantum-computation
+  mastery_signals:
+  - Synthesize the course by connecting entropy, coding, reliability, secrecy, and
+    physical interpretation in one coherent narrative.
   mastery_profile: {}
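Aside: the revised 'Shannon Entropy' concept above asks the learner to compute the entropy of a Bernoulli source. A minimal sketch of that exercise in Python (illustrative only; it is not part of the pack or of this commit, and the function name is made up here):

```python
import math

def bernoulli_entropy(p: float) -> float:
    """Shannon entropy H(p) of a Bernoulli(p) source, in bits."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# A fair coin is the maximally uncertain binary source: H(0.5) = 1 bit.
print(bernoulli_entropy(0.5))  # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(bernoulli_entropy(0.9))
```

Interpreting the result, as the exercise requests: the closer p is to 0 or 1, the fewer bits per symbol an ideal code needs on average.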

View File

@@ -2,9 +2,19 @@
   "rights_note": "Derived from MIT OpenCourseWare 6.050J Information and Entropy (Spring 2008). Retain MIT OCW attribution and applicable Creative Commons terms before redistribution.",
   "sources": [
     {
-      "source_path": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/6-050j-information-and-entropy.md",
+      "source_path": "examples/ocw-information-entropy/course/course-home.md",
       "source_type": "markdown",
-      "title": "6 050J Information And Entropy"
+      "title": "Course Home"
+    },
+    {
+      "source_path": "examples/ocw-information-entropy/course/syllabus.md",
+      "source_type": "markdown",
+      "title": "Syllabus"
+    },
+    {
+      "source_path": "examples/ocw-information-entropy/course/unit-sequence.md",
+      "source_type": "markdown",
+      "title": "Unit Sequence"
     }
   ]
 }

View File

@@ -12,3 +12,5 @@ dependencies: []
 overrides: []
 profile_templates: {}
 cross_pack_links: []
+supporting_artifacts:
+- source_corpus.json

View File

@ -3,6 +3,7 @@
"display_name": "MIT OCW Information and Entropy", "display_name": "MIT OCW Information and Entropy",
"derived_from_sources": [ "derived_from_sources": [
"mit-ocw-6-050j-course-home", "mit-ocw-6-050j-course-home",
"mit-ocw-6-050j-syllabus",
"mit-ocw-6-050j-unit-8-textbook", "mit-ocw-6-050j-unit-8-textbook",
"mit-ocw-6-050j-unit-13-textbook" "mit-ocw-6-050j-unit-13-textbook"
], ],

View File

@@ -1,59 +1,5 @@
 # Review Report
-- Module 'Imported from MARKDOWN' has no explicit exercises; mastery signals may be weak.
-- Concept 'MIT OCW 6.050J Information and Entropy' has no extracted mastery signals; review manually.
-- Concept 'Information' has no extracted mastery signals; review manually.
-- Concept 'Entropy' has no extracted mastery signals; review manually.
-- Concept 'Source' has no extracted mastery signals; review manually.
-- Concept 'OpenCourseWare' has no extracted mastery signals; review manually.
-- Concept 'Spring' has no extracted mastery signals; review manually.
-- Concept 'Attribution' has no extracted mastery signals; review manually.
-- Concept 'Counting and Probability' has no extracted mastery signals; review manually.
-- Concept 'Counting' has no extracted mastery signals; review manually.
-- Concept 'Probability' has no extracted mastery signals; review manually.
-- Concept 'Objective' has no extracted mastery signals; review manually.
-- Concept 'Explain' has no extracted mastery signals; review manually.
-- Concept 'Exercise' has no extracted mastery signals; review manually.
-- Concept 'Derive' has no extracted mastery signals; review manually.
-- Concept 'This' has no extracted mastery signals; review manually.
-- Concept 'Random' has no extracted mastery signals; review manually.
-- Concept 'Shannon Entropy' has no extracted mastery signals; review manually.
-- Concept 'Shannon' has no extracted mastery signals; review manually.
-- Concept 'Compute' has no extracted mastery signals; review manually.
-- Concept 'Bernoulli' has no extracted mastery signals; review manually.
-- Concept 'Mutual Information' has no extracted mastery signals; review manually.
-- Concept 'Mutual' has no extracted mastery signals; review manually.
-- Concept 'Compare' has no extracted mastery signals; review manually.
-- Concept 'Dependence' has no extracted mastery signals; review manually.
-- Concept 'Data Compression' has no extracted mastery signals; review manually.
-- Concept 'Data' has no extracted mastery signals; review manually.
-- Concept 'Compression' has no extracted mastery signals; review manually.
-- Concept 'Describe' has no extracted mastery signals; review manually.
-- Concept 'Redundancy' has no extracted mastery signals; review manually.
-- Concept 'Huffman Coding' has no extracted mastery signals; review manually.
-- Concept 'Huffman' has no extracted mastery signals; review manually.
-- Concept 'Coding' has no extracted mastery signals; review manually.
-- Concept 'Build' has no extracted mastery signals; review manually.
-- Concept 'Prefix' has no extracted mastery signals; review manually.
-- Concept 'Channel Capacity' has no extracted mastery signals; review manually.
-- Concept 'Channel' has no extracted mastery signals; review manually.
-- Concept 'Capacity' has no extracted mastery signals; review manually.
-- Concept 'State' has no extracted mastery signals; review manually.
-- Concept 'Reliable' has no extracted mastery signals; review manually.
-- Concept 'Channel Coding' has no extracted mastery signals; review manually.
-- Concept 'Contrast' has no extracted mastery signals; review manually.
-- Concept 'Decoding' has no extracted mastery signals; review manually.
-- Concept 'Error Correcting Codes' has no extracted mastery signals; review manually.
-- Concept 'Error' has no extracted mastery signals; review manually.
-- Concept 'Correcting' has no extracted mastery signals; review manually.
-- Concept 'Codes' has no extracted mastery signals; review manually.
-- Concept 'Cryptography and Information Hiding' has no extracted mastery signals; review manually.
-- Concept 'Cryptography' has no extracted mastery signals; review manually.
-- Concept 'Hiding' has no extracted mastery signals; review manually.
-- Concept 'Thermodynamics and Entropy' has no extracted mastery signals; review manually.
-- Concept 'Thermodynamics' has no extracted mastery signals; review manually.
-- Concept 'Course Synthesis' has no extracted mastery signals; review manually.
-- Concept 'Course' has no extracted mastery signals; review manually.
-- Concept 'Synthesis' has no extracted mastery signals; review manually.
-- Concept 'Synthesize' has no extracted mastery signals; review manually.
-- Concept 'Produce' has no extracted mastery signals; review manually.
+- Concept 'MIT OCW 6.050J Information and Entropy: Course Home' has no extracted mastery signals; review manually.
+- Concept 'MIT OCW 6.050J Information and Entropy: Syllabus' has no extracted mastery signals; review manually.
+- Concept 'MIT OCW 6.050J Information and Entropy: Unit Sequence' has no extracted mastery signals; review manually.
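Aside: one exercise carried through this changeset asks the learner to compare independent and dependent variables using mutual-information reasoning. A hedged sketch of how that comparison can be made concrete with empirical counts (illustrative only, not part of the commit; the helper name is made up here):

```python
import math
from collections import Counter

def mutual_information(pairs: list[tuple[int, int]]) -> float:
    """Empirical mutual information I(X;Y) in bits, from (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)                 # joint counts
    px = Counter(x for x, _ in pairs)    # marginal counts of X
    py = Counter(y for _, y in pairs)    # marginal counts of Y
    return sum(
        (c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )

# Dependent: y always equals x, so observing y removes all uncertainty about x.
dependent = [(b, b) for b in (0, 1) for _ in range(50)]
# Independent: every (x, y) combination equally likely, so I(X;Y) = 0.
independent = [(x, y) for x in (0, 1) for y in (0, 1) for _ in range(25)]
print(mutual_information(dependent))    # 1.0
print(mutual_information(independent))  # 0.0
```

For perfectly correlated fair bits the mutual information equals the full one bit of entropy; for independent bits it vanishes, which is the contrast the exercise asks for.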

View File

@ -2,16 +2,50 @@ stages:
- id: stage-1 - id: stage-1
title: Imported from MARKDOWN title: Imported from MARKDOWN
concepts: concepts:
- mit-ocw-6-050j-information-and-entropy - mit-ocw-6-050j-information-and-entropy-course-home
- information-and-entropy
- ultimate-limits-to-communication-and-computation
- open-textbooks-problem-sets-and-programming-work
- mit-ocw-6-050j-information-and-entropy-syllabus
- prerequisites-and-mathematical-background
- assessment-structure
- course-notes-and-reference-texts
- independent-reasoning-and-careful-comparison
- mit-ocw-6-050j-information-and-entropy-unit-sequence
- counting-and-probability - counting-and-probability
- shannon-entropy - shannon-entropy
- mutual-information - mutual-information
- data-compression - source-coding-and-compression
- huffman-coding - huffman-coding
- channel-capacity - channel-capacity
- channel-coding - channel-coding
- error-correcting-codes - error-correcting-codes
- cryptography-and-information-hiding - cryptography-and-information-hiding
- thermodynamics-and-entropy - thermodynamics-and-entropy
- reversible-computation-and-quantum-computation
- course-synthesis - course-synthesis
checkpoint: [] checkpoint:
- Summarize the course in one paragraph for a prospective learner.
- List the main topic clusters that connect communication, computation, and entropy.
- Explain how these resource types support both conceptual study and practice.
- Decide whether a learner needs review in probability, linear algebra, or signals
before beginning.
- Build a study schedule that alternates reading, derivation, and worked exercises.
- Compare when to use course notes versus outside references for clarification.
- Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic
entropy.
- Derive a simple counting argument for binary strings and compute an event probability.
- Compute the entropy of a Bernoulli source and interpret the result.
- Compare independent variables with dependent variables using mutual-information
reasoning.
- Describe when compression succeeds and when it fails on already-random data.
- Build a Huffman code for a small source alphabet.
- State why reliable transmission above capacity is impossible in the long run.
- Contrast uncoded transmission with coded transmission on a noisy channel.
- Describe a simple parity-style code and its limits.
- Compare a secure scheme with a weak one in terms of revealed information.
- Compare the two entropy notions and identify what is preserved across the analogy.
- Summarize how reversible computation changes the discussion of dissipation and
information loss.
- Produce a final study guide that links source coding, channel coding, secrecy,
thermodynamic analogies, and computation.
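Aside: the checkpoint item "Build a Huffman code for a small source alphabet" can be sketched as follows (illustrative only, not part of the pack or this commit; `huffman_code` is a hypothetical helper):

```python
import heapq

def huffman_code(freqs: dict[str, int]) -> dict[str, str]:
    """Build a binary prefix code: frequent symbols get shorter codewords."""
    # Heap entries: (total frequency, unique tiebreak, {symbol: partial codeword}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Prepend a bit that distinguishes the two merged subtrees.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

code = huffman_code({"a": 5, "b": 2, "c": 1, "d": 1})
# 'a' (most frequent) gets a 1-bit codeword; 'c' and 'd' get 3 bits,
# and no codeword is a prefix of another.
```

Repeatedly merging the two least frequent subtrees is exactly the expected-length argument the Huffman Coding concept points at: the deeper a symbol sits in the tree, the longer its codeword.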

View File

@@ -0,0 +1,803 @@
{
  "course_title": "MIT OCW Information and Entropy",
  "rights_note": "Derived from MIT OpenCourseWare 6.050J Information and Entropy (Spring 2008). Retain MIT OCW attribution and applicable Creative Commons terms before redistribution.",
  "sources": [
    {
      "source_path": "examples/ocw-information-entropy/course/course-home.md",
      "source_type": "markdown",
      "title": "Course Home",
      "metadata": {}
    },
    {
      "source_path": "examples/ocw-information-entropy/course/syllabus.md",
      "source_type": "markdown",
      "title": "Syllabus",
      "metadata": {}
    },
    {
      "source_path": "examples/ocw-information-entropy/course/unit-sequence.md",
      "source_type": "markdown",
      "title": "Unit Sequence",
      "metadata": {}
    }
  ],
  "fragments": [
    {
      "fragment_id": "imported-from-markdown::mit-ocw-6.050j-information-and-entropy:-course-home::body",
      "kind": "lesson_body",
      "module_title": "Imported from MARKDOWN",
      "lesson_title": "MIT OCW 6.050J Information and Entropy: Course Home",
      "text": "Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/\nAttribution: adapted from the MIT OpenCourseWare course home page for 6.050J Information and Entropy.",
      "source_refs": [
        "examples/ocw-information-entropy/course/course-home.md"
      ],
      "objectives": [],
      "exercises": [],
      "key_terms": [
        "Information",
        "Entropy"
      ]
    },
    {
      "fragment_id": "imported-from-markdown::information-and-entropy::body",
      "kind": "lesson_body",
      "module_title": "Imported from MARKDOWN",
      "lesson_title": "Information and Entropy",
      "text": "- Objective: Identify the course title, instructors, departments, level, and major topical areas.\n- Exercise: Summarize the course in one paragraph for a prospective learner.\nMIT OpenCourseWare presents 6.050J Information and Entropy as a Spring 2008 undergraduate subject taught by Paul Penfield and Seth Lloyd in Electrical Engineering and Computer Science together with Mechanical Engineering. The catalog framing emphasizes theory of computation, signal processing, and mathematical reasoning about information.",
      "source_refs": [
        "examples/ocw-information-entropy/course/course-home.md"
      ],
      "objectives": [
        "Identify the course title, instructors, departments, level, and major topical areas."
      ],
      "exercises": [
        "Summarize the course in one paragraph for a prospective learner."
      ],
      "key_terms": [
        "Information",
        "Entropy",
        "Paul",
        "Penfield",
        "Seth",
        "Lloyd",
        "Electrical",
        "Engineering"
      ]
    },
    {
      "fragment_id": "imported-from-markdown::information-and-entropy::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Information and Entropy",
"text": "Identify the course title, instructors, departments, level, and major topical areas.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
]
},
{
"fragment_id": "imported-from-markdown::information-and-entropy::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Information and Entropy",
"text": "Summarize the course in one paragraph for a prospective learner.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
]
},
{
"fragment_id": "imported-from-markdown::ultimate-limits-to-communication-and-computation::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Ultimate Limits to Communication and Computation",
"text": "- Objective: Explain the broad intellectual scope of the course.\n- Exercise: List the main topic clusters that connect communication, computation, and entropy.\nThe course examines the ultimate limits to communication and computation with emphasis on the physical nature of information processing. The source description highlights information and computation, digital signals, codes and compression, noise, probability, error correction, reversible and irreversible operations, physics of computation, and quantum computation. Entropy is explicitly connected both to channel capacity and to the second law of thermodynamics.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
],
"objectives": [
"Explain the broad intellectual scope of the course."
],
"exercises": [
"List the main topic clusters that connect communication, computation, and entropy."
],
"key_terms": [
"Entropy"
]
},
{
"fragment_id": "imported-from-markdown::ultimate-limits-to-communication-and-computation::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Ultimate Limits to Communication and Computation",
"text": "Explain the broad intellectual scope of the course.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
]
},
{
"fragment_id": "imported-from-markdown::ultimate-limits-to-communication-and-computation::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Ultimate Limits to Communication and Computation",
"text": "List the main topic clusters that connect communication, computation, and entropy.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
]
},
{
"fragment_id": "imported-from-markdown::open-textbooks,-problem-sets,-and-programming-work::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Open Textbooks, Problem Sets, and Programming Work",
"text": "- Objective: Identify the main kinds of learning resources supplied through the course.\n- Exercise: Explain how these resource types support both conceptual study and practice.\nThe course home lists open textbooks, problem sets, problem set solutions, and programming assignments. A learner using Didactopus should treat these as complementary evidence sources rather than relying on one summary alone.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
],
"objectives": [
"Identify the main kinds of learning resources supplied through the course."
],
"exercises": [
"Explain how these resource types support both conceptual study and practice."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::open-textbooks,-problem-sets,-and-programming-work::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Open Textbooks, Problem Sets, and Programming Work",
"text": "Identify the main kinds of learning resources supplied through the course.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
]
},
{
"fragment_id": "imported-from-markdown::open-textbooks,-problem-sets,-and-programming-work::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Open Textbooks, Problem Sets, and Programming Work",
"text": "Explain how these resource types support both conceptual study and practice.",
"source_refs": [
"examples/ocw-information-entropy/course/course-home.md"
]
},
{
"fragment_id": "imported-from-markdown::mit-ocw-6.050j-information-and-entropy:-syllabus::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "MIT OCW 6.050J Information and Entropy: Syllabus",
"text": "Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/\nAttribution: adapted from the MIT OpenCourseWare syllabus page for 6.050J Information and Entropy.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
],
"objectives": [],
"exercises": [],
"key_terms": [
"Information",
"Entropy"
]
},
{
"fragment_id": "imported-from-markdown::prerequisites-and-mathematical-background::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Prerequisites and Mathematical Background",
"text": "- Objective: Explain the mathematical maturity expected by the course.\n- Exercise: Decide whether a learner needs review in probability, linear algebra, or signals before beginning.\nThe syllabus expects a foundation comparable to MIT subjects in calculus and linear algebra, together with comfort in probability, signals, and basic programming. Didactopus should therefore surface prerequisite review when those foundations appear weak.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
],
"objectives": [
"Explain the mathematical maturity expected by the course."
],
"exercises": [
"Decide whether a learner needs review in probability, linear algebra, or signals before beginning."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::prerequisites-and-mathematical-background::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Prerequisites and Mathematical Background",
"text": "Explain the mathematical maturity expected by the course.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::prerequisites-and-mathematical-background::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Prerequisites and Mathematical Background",
"text": "Decide whether a learner needs review in probability, linear algebra, or signals before beginning.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::assessment-structure::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Assessment Structure",
"text": "- Objective: Identify the role of problem sets, exams, and programming work in the course.\n- Exercise: Build a study schedule that alternates reading, derivation, and worked exercises.\nThe syllabus emphasizes regular problem solving and quantitative reasoning. The course is not only a reading list: learners are expected to derive results, solve structured problems, and connect abstract arguments to implementation-oriented tasks.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
],
"objectives": [
"Identify the role of problem sets, exams, and programming work in the course."
],
"exercises": [
"Build a study schedule that alternates reading, derivation, and worked exercises."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::assessment-structure::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Assessment Structure",
"text": "Identify the role of problem sets, exams, and programming work in the course.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::assessment-structure::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Assessment Structure",
"text": "Build a study schedule that alternates reading, derivation, and worked exercises.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::course-notes-and-reference-texts::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Course Notes and Reference Texts",
"text": "- Objective: Explain how the course notes and textbook references supply the core conceptual sequence.\n- Exercise: Compare when to use course notes versus outside references for clarification.\nMIT OCW links course notes and textbook-style resources through the syllabus and resource pages. The intended use is cumulative: earlier notes establish counting, probability, and entropy, while later materials expand into coding, noise, secrecy, thermodynamics, and computation.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
],
"objectives": [
"Explain how the course notes and textbook references supply the core conceptual sequence."
],
"exercises": [
"Compare when to use course notes versus outside references for clarification."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::course-notes-and-reference-texts::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Course Notes and Reference Texts",
"text": "Explain how the course notes and textbook references supply the core conceptual sequence.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::course-notes-and-reference-texts::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Course Notes and Reference Texts",
"text": "Compare when to use course notes versus outside references for clarification.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::independent-reasoning-and-careful-comparison::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Independent Reasoning and Careful Comparison",
"text": "- Objective: Explain why the course requires precise comparison of related but non-identical concepts.\n- Exercise: Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic entropy.\nThe syllabus framing implies a style of work where analogy is useful but dangerous when used loosely. Learners must compare models carefully, state assumptions, and notice where similar mathematics does not imply identical interpretation.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
],
"objectives": [
"Explain why the course requires precise comparison of related but non-identical concepts."
],
"exercises": [
"Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic entropy."
],
"key_terms": [
"Shannon",
"Learners"
]
},
{
"fragment_id": "imported-from-markdown::independent-reasoning-and-careful-comparison::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Independent Reasoning and Careful Comparison",
"text": "Explain why the course requires precise comparison of related but non-identical concepts.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::independent-reasoning-and-careful-comparison::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Independent Reasoning and Careful Comparison",
"text": "Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic entropy.",
"source_refs": [
"examples/ocw-information-entropy/course/syllabus.md"
]
},
{
"fragment_id": "imported-from-markdown::mit-ocw-6.050j-information-and-entropy:-unit-sequence::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "MIT OCW 6.050J Information and Entropy: Unit Sequence",
"text": "Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/\nAttribution: adapted from the MIT OpenCourseWare unit progression and resource organization for 6.050J Information and Entropy.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [],
"exercises": [],
"key_terms": [
"Information",
"Entropy"
]
},
{
"fragment_id": "imported-from-markdown::counting-and-probability::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Counting and Probability",
"text": "- Objective: Explain how counting arguments, probability spaces, and random variables support later information-theory results.\n- Exercise: Derive a simple counting argument for binary strings and compute an event probability.\nEarly units establish counting, combinatorics, and probability as the language used to reason about uncertainty, messages, and evidence.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain how counting arguments, probability spaces, and random variables support later information-theory results."
],
"exercises": [
"Derive a simple counting argument for binary strings and compute an event probability."
],
"key_terms": [
"Derive"
]
},
{
"fragment_id": "imported-from-markdown::counting-and-probability::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Counting and Probability",
"text": "Explain how counting arguments, probability spaces, and random variables support later information-theory results.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::counting-and-probability::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Counting and Probability",
"text": "Derive a simple counting argument for binary strings and compute an event probability.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::shannon-entropy::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Shannon Entropy",
"text": "- Objective: Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources.\n- Exercise: Compute the entropy of a Bernoulli source and interpret the result.\nThe course then introduces entropy as a quantitative measure of uncertainty for a source model and uses it to reason about representation cost and surprise.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources."
],
"exercises": [
"Compute the entropy of a Bernoulli source and interpret the result."
],
"key_terms": [
"Shannon",
"Bernoulli"
]
},
{
"fragment_id": "imported-from-markdown::shannon-entropy::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Shannon Entropy",
"text": "Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::shannon-entropy::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Shannon Entropy",
"text": "Compute the entropy of a Bernoulli source and interpret the result.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::mutual-information::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Mutual Information",
"text": "- Objective: Explain mutual information and relate it to dependence between signals or observations.\n- Exercise: Compare independent variables with dependent variables using mutual-information reasoning.\nThese units ask the learner to understand how observation changes uncertainty and what it means for one variable to carry information about another.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain mutual information and relate it to dependence between signals or observations."
],
"exercises": [
"Compare independent variables with dependent variables using mutual-information reasoning."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::mutual-information::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Mutual Information",
"text": "Explain mutual information and relate it to dependence between signals or observations.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::mutual-information::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Mutual Information",
"text": "Compare independent variables with dependent variables using mutual-information reasoning.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::source-coding-and-compression::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Source Coding and Compression",
"text": "- Objective: Explain lossless compression in terms of entropy, redundancy, and coding choices.\n- Exercise: Describe when compression succeeds and when it fails on already-random data.\nThe course develops the idea that structured sources can often be described more efficiently, but only up to limits implied by entropy.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain lossless compression in terms of entropy, redundancy, and coding choices."
],
"exercises": [
"Describe when compression succeeds and when it fails on already-random data."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::source-coding-and-compression::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Source Coding and Compression",
"text": "Explain lossless compression in terms of entropy, redundancy, and coding choices.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::source-coding-and-compression::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Source Coding and Compression",
"text": "Describe when compression succeeds and when it fails on already-random data.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::huffman-coding::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Huffman Coding",
"text": "- Objective: Explain Huffman coding and justify why likely symbols receive shorter descriptions.\n- Exercise: Build a Huffman code for a small source alphabet.\nLearners use trees and expected length arguments to connect probability models to practical code design.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain Huffman coding and justify why likely symbols receive shorter descriptions."
],
"exercises": [
"Build a Huffman code for a small source alphabet."
],
"key_terms": [
"Huffman",
"Learners"
]
},
{
"fragment_id": "imported-from-markdown::huffman-coding::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Huffman Coding",
"text": "Explain Huffman coding and justify why likely symbols receive shorter descriptions.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::huffman-coding::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Huffman Coding",
"text": "Build a Huffman code for a small source alphabet.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::channel-capacity::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Channel Capacity",
"text": "- Objective: Explain channel capacity as a limit on reliable communication over a noisy channel.\n- Exercise: State why reliable transmission above capacity is impossible in the long run.\nThe course treats capacity as a fundamental upper bound and frames noisy communication in terms of rates, inference, and uncertainty reduction.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain channel capacity as a limit on reliable communication over a noisy channel."
],
"exercises": [
"State why reliable transmission above capacity is impossible in the long run."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::channel-capacity::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Channel Capacity",
"text": "Explain channel capacity as a limit on reliable communication over a noisy channel.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::channel-capacity::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Channel Capacity",
"text": "State why reliable transmission above capacity is impossible in the long run.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::channel-coding::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Channel Coding",
"text": "- Objective: Explain how channel coding adds redundancy to protect messages from noise.\n- Exercise: Contrast uncoded transmission with coded transmission on a noisy channel.\nThese units emphasize that redundancy can be wasteful in compression but essential in communication under uncertainty.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain how channel coding adds redundancy to protect messages from noise."
],
"exercises": [
"Contrast uncoded transmission with coded transmission on a noisy channel."
],
"key_terms": [
"Contrast"
]
},
{
"fragment_id": "imported-from-markdown::channel-coding::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Channel Coding",
"text": "Explain how channel coding adds redundancy to protect messages from noise.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::channel-coding::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Channel Coding",
"text": "Contrast uncoded transmission with coded transmission on a noisy channel.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::error-correcting-codes::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Error Correcting Codes",
"text": "- Objective: Explain how error-correcting codes detect or repair corrupted symbols.\n- Exercise: Describe a simple parity-style code and its limits.\nThe learner must connect abstract limits to concrete coding mechanisms and understand both strengths and failure modes.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain how error-correcting codes detect or repair corrupted symbols."
],
"exercises": [
"Describe a simple parity-style code and its limits."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::error-correcting-codes::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Error Correcting Codes",
"text": "Explain how error-correcting codes detect or repair corrupted symbols.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::error-correcting-codes::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Error Correcting Codes",
"text": "Describe a simple parity-style code and its limits.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::cryptography-and-information-hiding::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Cryptography and Information Hiding",
"text": "- Objective: Explain the relationship between secrecy, information leakage, and coded communication.\n- Exercise: Compare a secure scheme with a weak one in terms of revealed information.\nThe course extends information-theoretic reasoning to adversarial settings where controlling what an observer can infer becomes central.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain the relationship between secrecy, information leakage, and coded communication."
],
"exercises": [
"Compare a secure scheme with a weak one in terms of revealed information."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::cryptography-and-information-hiding::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Cryptography and Information Hiding",
"text": "Explain the relationship between secrecy, information leakage, and coded communication.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::cryptography-and-information-hiding::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Cryptography and Information Hiding",
"text": "Compare a secure scheme with a weak one in terms of revealed information.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::thermodynamics-and-entropy::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Thermodynamics and Entropy",
"text": "- Objective: Explain how thermodynamic entropy relates to, and differs from, Shannon entropy.\n- Exercise: Compare the two entropy notions and identify what is preserved across the analogy.\nThe course uses entropy as a bridge concept between communication theory and physics while insisting on careful interpretation.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain how thermodynamic entropy relates to, and differs from, Shannon entropy."
],
"exercises": [
"Compare the two entropy notions and identify what is preserved across the analogy."
],
"key_terms": [
"Shannon"
]
},
{
"fragment_id": "imported-from-markdown::thermodynamics-and-entropy::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Thermodynamics and Entropy",
"text": "Explain how thermodynamic entropy relates to, and differs from, Shannon entropy.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::thermodynamics-and-entropy::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Thermodynamics and Entropy",
"text": "Compare the two entropy notions and identify what is preserved across the analogy.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::reversible-computation-and-quantum-computation::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Reversible Computation and Quantum Computation",
"text": "- Objective: Explain why the physical implementation of computation matters for information processing limits.\n- Exercise: Summarize how reversible computation changes the discussion of dissipation and information loss.\nLater units connect information, entropy, and computation more directly by considering reversible logic, irreversibility, and quantum information themes.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Explain why the physical implementation of computation matters for information processing limits."
],
"exercises": [
"Summarize how reversible computation changes the discussion of dissipation and information loss."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::reversible-computation-and-quantum-computation::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Reversible Computation and Quantum Computation",
"text": "Explain why the physical implementation of computation matters for information processing limits.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::reversible-computation-and-quantum-computation::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Reversible Computation and Quantum Computation",
"text": "Summarize how reversible computation changes the discussion of dissipation and information loss.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::course-synthesis::body",
"kind": "lesson_body",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Course Synthesis",
"text": "- Objective: Synthesize the course by connecting entropy, coding, reliability, secrecy, and physical interpretation in one coherent narrative.\n- Exercise: Produce a final study guide that links source coding, channel coding, secrecy, thermodynamic analogies, and computation.\nThe end of the course asks the learner to unify the mathematical and physical perspectives rather than treating the units as disconnected topics.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
],
"objectives": [
"Synthesize the course by connecting entropy, coding, reliability, secrecy, and physical interpretation in one coherent narrative."
],
"exercises": [
"Produce a final study guide that links source coding, channel coding, secrecy, thermodynamic analogies, and computation."
],
"key_terms": []
},
{
"fragment_id": "imported-from-markdown::course-synthesis::objective-1",
"kind": "objective",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Course Synthesis",
"text": "Synthesize the course by connecting entropy, coding, reliability, secrecy, and physical interpretation in one coherent narrative.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
},
{
"fragment_id": "imported-from-markdown::course-synthesis::exercise-1",
"kind": "exercise",
"module_title": "Imported from MARKDOWN",
"lesson_title": "Course Synthesis",
"text": "Produce a final study guide that links source coding, channel coding, secrecy, thermodynamic analogies, and computation.",
"source_refs": [
"examples/ocw-information-entropy/course/unit-sequence.md"
]
}
]
}
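The fragment records above all share one shape: `fragment_id`, `kind`, `text`, `source_refs`, plus optional `objectives`, `exercises`, and `key_terms`. A minimal sketch of how such an export could be consumed downstream — the top-level `fragments` key and the function name are assumptions for illustration, not confirmed by this commit:

```python
import json
from collections import defaultdict

def group_fragments_by_kind(path):
    """Index an exported fragment list by its 'kind' field."""
    with open(path, encoding="utf-8") as fh:
        data = json.load(fh)
    by_kind = defaultdict(list)
    # Assumes the fragments sit under a top-level "fragments" key.
    for fragment in data.get("fragments", []):
        by_kind[fragment["kind"]].append(fragment["fragment_id"])
    return dict(by_kind)
```

Grouping by `kind` separates `lesson_body`, `objective`, and `exercise` records, which is the split the mastery-signal extraction appears to rely on.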

View File

@@ -6,7 +6,20 @@ sources:
     creator: MIT OpenCourseWare
     license_id: CC BY-NC-SA 4.0
     license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
-    retrieved_at: "2026-03-14"
+    retrieved_at: "2026-03-16"
+    adapted: true
+    attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
+    excluded_from_upstream_license: false
+    exclusion_notes: ""
+  - source_id: mit-ocw-6-050j-syllabus
+    title: MIT OpenCourseWare 6.050J Information and Entropy syllabus
+    url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/
+    publisher: Massachusetts Institute of Technology
+    creator: MIT OpenCourseWare
+    license_id: CC BY-NC-SA 4.0
+    license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
+    retrieved_at: "2026-03-16"
     adapted: true
     attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
     excluded_from_upstream_license: false
@@ -19,7 +32,7 @@ sources:
     creator: MIT OpenCourseWare
     license_id: CC BY-NC-SA 4.0
     license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
-    retrieved_at: "2026-03-14"
+    retrieved_at: "2026-03-16"
     adapted: true
     attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
     excluded_from_upstream_license: false
@@ -32,7 +45,7 @@ sources:
     creator: MIT OpenCourseWare
     license_id: CC BY-NC-SA 4.0
     license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
-    retrieved_at: "2026-03-14"
+    retrieved_at: "2026-03-16"
     adapted: true
     attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
     excluded_from_upstream_license: false
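Each source entry in the inventory above carries the same license-metadata fields. A small validation sketch for such entries; the required-field list is inferred from the entries visible in this diff, and the helper name is hypothetical:

```python
# Required fields inferred from the source entries shown in the diff above.
REQUIRED_FIELDS = (
    "source_id", "title", "url", "publisher", "creator",
    "license_id", "license_url", "retrieved_at", "adapted",
    "attribution_text", "excluded_from_upstream_license", "exclusion_notes",
)

def missing_fields(source):
    """Return the required inventory fields absent from one source entry."""
    return [field for field in REQUIRED_FIELDS if field not in source]
```

A check like this would catch an entry that gained a `retrieved_at` update but lost, say, its `attribution_text`, before the pack compliance manifest is generated.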

View File

@@ -3,9 +3,54 @@
   "domain": "MIT OCW Information and Entropy",
   "artifacts": [
     {
-      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
+      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home",
       "artifact_type": "symbolic",
-      "artifact_name": "mit-ocw-6-050j-information-and-entropy.md"
+      "artifact_name": "mit-ocw-6-050j-information-and-entropy-course-home.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::information-and-entropy",
+      "artifact_type": "symbolic",
+      "artifact_name": "information-and-entropy.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation",
+      "artifact_type": "symbolic",
+      "artifact_name": "ultimate-limits-to-communication-and-computation.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work",
+      "artifact_type": "symbolic",
+      "artifact_name": "open-textbooks-problem-sets-and-programming-work.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus",
+      "artifact_type": "symbolic",
+      "artifact_name": "mit-ocw-6-050j-information-and-entropy-syllabus.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background",
+      "artifact_type": "symbolic",
+      "artifact_name": "prerequisites-and-mathematical-background.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::assessment-structure",
+      "artifact_type": "symbolic",
+      "artifact_name": "assessment-structure.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::course-notes-and-reference-texts",
+      "artifact_type": "symbolic",
+      "artifact_name": "course-notes-and-reference-texts.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison",
+      "artifact_type": "symbolic",
+      "artifact_name": "independent-reasoning-and-careful-comparison.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence",
+      "artifact_type": "symbolic",
+      "artifact_name": "mit-ocw-6-050j-information-and-entropy-unit-sequence.md"
     },
     {
       "concept": "mit-ocw-information-and-entropy::counting-and-probability",
@@ -23,9 +68,9 @@
       "artifact_name": "mutual-information.md"
     },
     {
-      "concept": "mit-ocw-information-and-entropy::data-compression",
+      "concept": "mit-ocw-information-and-entropy::source-coding-and-compression",
       "artifact_type": "symbolic",
-      "artifact_name": "data-compression.md"
+      "artifact_name": "source-coding-and-compression.md"
     },
     {
       "concept": "mit-ocw-information-and-entropy::huffman-coding",

View File

@@ -3,24 +3,42 @@
   "display_name": "OCW Information Entropy Agent",
   "domain": "MIT OCW Information and Entropy",
   "mastered_concepts": [
+    "mit-ocw-information-and-entropy::assessment-structure",
     "mit-ocw-information-and-entropy::channel-capacity",
     "mit-ocw-information-and-entropy::channel-coding",
     "mit-ocw-information-and-entropy::counting-and-probability",
+    "mit-ocw-information-and-entropy::course-notes-and-reference-texts",
     "mit-ocw-information-and-entropy::cryptography-and-information-hiding",
-    "mit-ocw-information-and-entropy::data-compression",
     "mit-ocw-information-and-entropy::error-correcting-codes",
     "mit-ocw-information-and-entropy::huffman-coding",
-    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
+    "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison",
+    "mit-ocw-information-and-entropy::information-and-entropy",
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home",
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus",
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence",
     "mit-ocw-information-and-entropy::mutual-information",
+    "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work",
+    "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background",
     "mit-ocw-information-and-entropy::shannon-entropy",
-    "mit-ocw-information-and-entropy::thermodynamics-and-entropy"
+    "mit-ocw-information-and-entropy::source-coding-and-compression",
+    "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
+    "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation"
   ],
   "weak_dimensions_by_concept": {
-    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy": [],
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home": [],
+    "mit-ocw-information-and-entropy::information-and-entropy": [],
+    "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation": [],
+    "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work": [],
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus": [],
+    "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background": [],
+    "mit-ocw-information-and-entropy::assessment-structure": [],
+    "mit-ocw-information-and-entropy::course-notes-and-reference-texts": [],
+    "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison": [],
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence": [],
     "mit-ocw-information-and-entropy::counting-and-probability": [],
     "mit-ocw-information-and-entropy::shannon-entropy": [],
     "mit-ocw-information-and-entropy::mutual-information": [],
-    "mit-ocw-information-and-entropy::data-compression": [],
+    "mit-ocw-information-and-entropy::source-coding-and-compression": [],
     "mit-ocw-information-and-entropy::huffman-coding": [],
     "mit-ocw-information-and-entropy::channel-capacity": [],
     "mit-ocw-information-and-entropy::channel-coding": [],
@@ -29,7 +47,52 @@
     "mit-ocw-information-and-entropy::thermodynamics-and-entropy": []
   },
   "evaluator_summary_by_concept": {
-    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy": {
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::information-and-entropy": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::assessment-structure": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::course-notes-and-reference-texts": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison": {
+      "correctness": 0.8400000000000001,
+      "explanation": 0.85,
+      "critique": 0.7999999999999999
+    },
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence": {
       "correctness": 0.8400000000000001,
       "explanation": 0.85,
       "critique": 0.7999999999999999
@@ -49,7 +112,7 @@
       "explanation": 0.85,
       "critique": 0.7999999999999999
     },
-    "mit-ocw-information-and-entropy::data-compression": {
+    "mit-ocw-information-and-entropy::source-coding-and-compression": {
       "correctness": 0.8400000000000001,
       "explanation": 0.85,
       "critique": 0.7999999999999999
@@ -87,9 +150,54 @@
     },
   "artifacts": [
     {
-      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
+      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home",
       "artifact_type": "symbolic",
-      "artifact_name": "mit-ocw-6-050j-information-and-entropy.md"
+      "artifact_name": "mit-ocw-6-050j-information-and-entropy-course-home.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::information-and-entropy",
+      "artifact_type": "symbolic",
+      "artifact_name": "information-and-entropy.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation",
+      "artifact_type": "symbolic",
+      "artifact_name": "ultimate-limits-to-communication-and-computation.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work",
+      "artifact_type": "symbolic",
+      "artifact_name": "open-textbooks-problem-sets-and-programming-work.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus",
+      "artifact_type": "symbolic",
+      "artifact_name": "mit-ocw-6-050j-information-and-entropy-syllabus.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background",
+      "artifact_type": "symbolic",
+      "artifact_name": "prerequisites-and-mathematical-background.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::assessment-structure",
+      "artifact_type": "symbolic",
+      "artifact_name": "assessment-structure.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::course-notes-and-reference-texts",
+      "artifact_type": "symbolic",
+      "artifact_name": "course-notes-and-reference-texts.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison",
+      "artifact_type": "symbolic",
+      "artifact_name": "independent-reasoning-and-careful-comparison.md"
+    },
+    {
+      "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence",
+      "artifact_type": "symbolic",
+      "artifact_name": "mit-ocw-6-050j-information-and-entropy-unit-sequence.md"
     },
     {
       "concept": "mit-ocw-information-and-entropy::counting-and-probability",
@@ -107,9 +215,9 @@
       "artifact_name": "mutual-information.md"
     },
     {
-      "concept": "mit-ocw-information-and-entropy::data-compression",
+      "concept": "mit-ocw-information-and-entropy::source-coding-and-compression",
       "artifact_type": "symbolic",
-      "artifact_name": "data-compression.md"
+      "artifact_name": "source-coding-and-compression.md"
     },
     {
       "concept": "mit-ocw-information-and-entropy::huffman-coding",

View File

@@ -4,19 +4,34 @@
 - Domain: `MIT OCW Information and Entropy`
 ## Mastered Concepts
+- mit-ocw-information-and-entropy::assessment-structure
 - mit-ocw-information-and-entropy::channel-capacity
 - mit-ocw-information-and-entropy::channel-coding
 - mit-ocw-information-and-entropy::counting-and-probability
+- mit-ocw-information-and-entropy::course-notes-and-reference-texts
 - mit-ocw-information-and-entropy::cryptography-and-information-hiding
-- mit-ocw-information-and-entropy::data-compression
 - mit-ocw-information-and-entropy::error-correcting-codes
 - mit-ocw-information-and-entropy::huffman-coding
-- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy
+- mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison
+- mit-ocw-information-and-entropy::information-and-entropy
+- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home
+- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus
+- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence
 - mit-ocw-information-and-entropy::mutual-information
+- mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work
+- mit-ocw-information-and-entropy::prerequisites-and-mathematical-background
 - mit-ocw-information-and-entropy::shannon-entropy
+- mit-ocw-information-and-entropy::source-coding-and-compression
 - mit-ocw-information-and-entropy::thermodynamics-and-entropy
+- mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation
 ## Concept Summaries
+### mit-ocw-information-and-entropy::assessment-structure
+- correctness: 0.84
+- critique: 0.80
+- explanation: 0.85
+- weak dimensions: none
 ### mit-ocw-information-and-entropy::channel-capacity
 - correctness: 0.84
 - critique: 0.80
@@ -35,13 +50,13 @@
 - explanation: 0.85
 - weak dimensions: none
-### mit-ocw-information-and-entropy::cryptography-and-information-hiding
+### mit-ocw-information-and-entropy::course-notes-and-reference-texts
 - correctness: 0.84
 - critique: 0.80
 - explanation: 0.85
 - weak dimensions: none
-### mit-ocw-information-and-entropy::data-compression
+### mit-ocw-information-and-entropy::cryptography-and-information-hiding
 - correctness: 0.84
 - critique: 0.80
 - explanation: 0.85
@@ -59,7 +74,31 @@
 - explanation: 0.85
 - weak dimensions: none
-### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy
+### mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison
+- correctness: 0.84
+- critique: 0.80
+- explanation: 0.85
+- weak dimensions: none
+### mit-ocw-information-and-entropy::information-and-entropy
+- correctness: 0.84
+- critique: 0.80
+- explanation: 0.85
+- weak dimensions: none
+### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home
+- correctness: 0.84
+- critique: 0.80
+- explanation: 0.85
+- weak dimensions: none
+### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus
+- correctness: 0.84
+- critique: 0.80
+- explanation: 0.85
+- weak dimensions: none
+### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence
 - correctness: 0.84
 - critique: 0.80
 - explanation: 0.85
@@ -71,24 +110,57 @@
 - explanation: 0.85
 - weak dimensions: none
+### mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work
+- correctness: 0.84
+- critique: 0.80
+- explanation: 0.85
+- weak dimensions: none
+### mit-ocw-information-and-entropy::prerequisites-and-mathematical-background
+- correctness: 0.84
+- critique: 0.80
+- explanation: 0.85
+- weak dimensions: none
 ### mit-ocw-information-and-entropy::shannon-entropy
 - correctness: 0.84
 - critique: 0.80
 - explanation: 0.85
 - weak dimensions: none
+### mit-ocw-information-and-entropy::source-coding-and-compression
+- correctness: 0.84
+- critique: 0.80
+- explanation: 0.85
+- weak dimensions: none
 ### mit-ocw-information-and-entropy::thermodynamics-and-entropy
 - correctness: 0.84
 - critique: 0.80
 - explanation: 0.85
 - weak dimensions: none
+### mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation
+- correctness: 0.84
+- critique: 0.80
+- explanation: 0.85
+- weak dimensions: none
 ## Artifacts
-- mit-ocw-6-050j-information-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy
+- mit-ocw-6-050j-information-and-entropy-course-home.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home
+- information-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::information-and-entropy
+- ultimate-limits-to-communication-and-computation.md (symbolic) for mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation
+- open-textbooks-problem-sets-and-programming-work.md (symbolic) for mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work
+- mit-ocw-6-050j-information-and-entropy-syllabus.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus
+- prerequisites-and-mathematical-background.md (symbolic) for mit-ocw-information-and-entropy::prerequisites-and-mathematical-background
+- assessment-structure.md (symbolic) for mit-ocw-information-and-entropy::assessment-structure
+- course-notes-and-reference-texts.md (symbolic) for mit-ocw-information-and-entropy::course-notes-and-reference-texts
+- independent-reasoning-and-careful-comparison.md (symbolic) for mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison
+- mit-ocw-6-050j-information-and-entropy-unit-sequence.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence
 - counting-and-probability.md (symbolic) for mit-ocw-information-and-entropy::counting-and-probability
 - shannon-entropy.md (symbolic) for mit-ocw-information-and-entropy::shannon-entropy
 - mutual-information.md (symbolic) for mit-ocw-information-and-entropy::mutual-information
-- data-compression.md (symbolic) for mit-ocw-information-and-entropy::data-compression
+- source-coding-and-compression.md (symbolic) for mit-ocw-information-and-entropy::source-coding-and-compression
 - huffman-coding.md (symbolic) for mit-ocw-information-and-entropy::huffman-coding
 - channel-capacity.md (symbolic) for mit-ocw-information-and-entropy::channel-capacity
 - channel-coding.md (symbolic) for mit-ocw-information-and-entropy::channel-coding

View File

@@ -1,75 +1,32 @@
 {
-  "course_source": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/6-050j-information-and-entropy.md",
-  "pack_dir": "/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy",
-  "skill_dir": "/home/netuser/dev/Didactopustry1/skills/ocw-information-entropy-agent",
-  "source_inventory": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/sources.yaml",
+  "course_source": "examples/ocw-information-entropy/course",
+  "source_document_count": 3,
+  "pack_dir": "domain-packs/mit-ocw-information-entropy",
+  "skill_dir": "skills/ocw-information-entropy-agent",
+  "source_inventory": "examples/ocw-information-entropy/sources.yaml",
   "review_flags": [
-    "Module 'Imported from MARKDOWN' has no explicit exercises; mastery signals may be weak.",
-    "Concept 'MIT OCW 6.050J Information and Entropy' has no extracted mastery signals; review manually.",
-    "Concept 'Information' has no extracted mastery signals; review manually.",
-    "Concept 'Entropy' has no extracted mastery signals; review manually.",
-    "Concept 'Source' has no extracted mastery signals; review manually.",
-    "Concept 'OpenCourseWare' has no extracted mastery signals; review manually.",
-    "Concept 'Spring' has no extracted mastery signals; review manually.",
-    "Concept 'Attribution' has no extracted mastery signals; review manually.",
-    "Concept 'Counting and Probability' has no extracted mastery signals; review manually.",
-    "Concept 'Counting' has no extracted mastery signals; review manually.",
-    "Concept 'Probability' has no extracted mastery signals; review manually.",
-    "Concept 'Objective' has no extracted mastery signals; review manually.",
-    "Concept 'Explain' has no extracted mastery signals; review manually.",
-    "Concept 'Exercise' has no extracted mastery signals; review manually.",
-    "Concept 'Derive' has no extracted mastery signals; review manually.",
-    "Concept 'This' has no extracted mastery signals; review manually.",
-    "Concept 'Random' has no extracted mastery signals; review manually.",
-    "Concept 'Shannon Entropy' has no extracted mastery signals; review manually.",
-    "Concept 'Shannon' has no extracted mastery signals; review manually.",
-    "Concept 'Compute' has no extracted mastery signals; review manually.",
-    "Concept 'Bernoulli' has no extracted mastery signals; review manually.",
-    "Concept 'Mutual Information' has no extracted mastery signals; review manually.",
-    "Concept 'Mutual' has no extracted mastery signals; review manually.",
-    "Concept 'Compare' has no extracted mastery signals; review manually.",
-    "Concept 'Dependence' has no extracted mastery signals; review manually.",
-    "Concept 'Data Compression' has no extracted mastery signals; review manually.",
-    "Concept 'Data' has no extracted mastery signals; review manually.",
-    "Concept 'Compression' has no extracted mastery signals; review manually.",
-    "Concept 'Describe' has no extracted mastery signals; review manually.",
-    "Concept 'Redundancy' has no extracted mastery signals; review manually.",
-    "Concept 'Huffman Coding' has no extracted mastery signals; review manually.",
-    "Concept 'Huffman' has no extracted mastery signals; review manually.",
-    "Concept 'Coding' has no extracted mastery signals; review manually.",
-    "Concept 'Build' has no extracted mastery signals; review manually.",
-    "Concept 'Prefix' has no extracted mastery signals; review manually.",
-    "Concept 'Channel Capacity' has no extracted mastery signals; review manually.",
-    "Concept 'Channel' has no extracted mastery signals; review manually.",
-    "Concept 'Capacity' has no extracted mastery signals; review manually.",
-    "Concept 'State' has no extracted mastery signals; review manually.",
-    "Concept 'Reliable' has no extracted mastery signals; review manually.",
-    "Concept 'Channel Coding' has no extracted mastery signals; review manually.",
-    "Concept 'Contrast' has no extracted mastery signals; review manually.",
-    "Concept 'Decoding' has no extracted mastery signals; review manually.",
-    "Concept 'Error Correcting Codes' has no extracted mastery signals; review manually.",
-    "Concept 'Error' has no extracted mastery signals; review manually.",
-    "Concept 'Correcting' has no extracted mastery signals; review manually.",
-    "Concept 'Codes' has no extracted mastery signals; review manually.",
-    "Concept 'Cryptography and Information Hiding' has no extracted mastery signals; review manually.",
-    "Concept 'Cryptography' has no extracted mastery signals; review manually.",
-    "Concept 'Hiding' has no extracted mastery signals; review manually.",
-    "Concept 'Thermodynamics and Entropy' has no extracted mastery signals; review manually.",
-    "Concept 'Thermodynamics' has no extracted mastery signals; review manually.",
-    "Concept 'Course Synthesis' has no extracted mastery signals; review manually.",
-    "Concept 'Course' has no extracted mastery signals; review manually.",
-    "Concept 'Synthesis' has no extracted mastery signals; review manually.",
-    "Concept 'Synthesize' has no extracted mastery signals; review manually.",
-    "Concept 'Produce' has no extracted mastery signals; review manually."
+    "Concept 'MIT OCW 6.050J Information and Entropy: Course Home' has no extracted mastery signals; review manually.",
+    "Concept 'MIT OCW 6.050J Information and Entropy: Syllabus' has no extracted mastery signals; review manually.",
+    "Concept 'MIT OCW 6.050J Information and Entropy: Unit Sequence' has no extracted mastery signals; review manually."
   ],
-  "concept_count": 56,
+  "concept_count": 34,
+  "source_fragment_count": 60,
   "target_concept": "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
   "curriculum_path": [
-    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home",
+    "mit-ocw-information-and-entropy::information-and-entropy",
+    "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation",
+    "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work",
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus",
+    "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background",
+    "mit-ocw-information-and-entropy::assessment-structure",
+    "mit-ocw-information-and-entropy::course-notes-and-reference-texts",
+    "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison",
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence",
     "mit-ocw-information-and-entropy::counting-and-probability",
     "mit-ocw-information-and-entropy::shannon-entropy",
     "mit-ocw-information-and-entropy::mutual-information",
-    "mit-ocw-information-and-entropy::data-compression",
+    "mit-ocw-information-and-entropy::source-coding-and-compression",
    "mit-ocw-information-and-entropy::huffman-coding",
     "mit-ocw-information-and-entropy::channel-capacity",
     "mit-ocw-information-and-entropy::channel-coding",
@@ -78,25 +35,35 @@
     "mit-ocw-information-and-entropy::thermodynamics-and-entropy"
   ],
   "mastered_concepts": [
+    "mit-ocw-information-and-entropy::assessment-structure",
     "mit-ocw-information-and-entropy::channel-capacity",
     "mit-ocw-information-and-entropy::channel-coding",
     "mit-ocw-information-and-entropy::counting-and-probability",
+    "mit-ocw-information-and-entropy::course-notes-and-reference-texts",
     "mit-ocw-information-and-entropy::cryptography-and-information-hiding",
-    "mit-ocw-information-and-entropy::data-compression",
     "mit-ocw-information-and-entropy::error-correcting-codes",
     "mit-ocw-information-and-entropy::huffman-coding",
-    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
+    "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison",
+    "mit-ocw-information-and-entropy::information-and-entropy",
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home",
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus",
+    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence",
     "mit-ocw-information-and-entropy::mutual-information",
+    "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work",
+    "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background",
     "mit-ocw-information-and-entropy::shannon-entropy",
-    "mit-ocw-information-and-entropy::thermodynamics-and-entropy"
+    "mit-ocw-information-and-entropy::source-coding-and-compression",
+    "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
+    "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation"
   ],
-  "artifact_count": 11,
-  "compliance_manifest": "/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy/pack_compliance_manifest.json",
+  "artifact_count": 20,
+  "compliance_manifest": "domain-packs/mit-ocw-information-entropy/pack_compliance_manifest.json",
"compliance": { "compliance": {
"pack_id": "mit-ocw-information-and-entropy", "pack_id": "mit-ocw-information-and-entropy",
"display_name": "MIT OCW Information and Entropy", "display_name": "MIT OCW Information and Entropy",
"derived_from_sources": [ "derived_from_sources": [
"mit-ocw-6-050j-course-home", "mit-ocw-6-050j-course-home",
"mit-ocw-6-050j-syllabus",
"mit-ocw-6-050j-unit-8-textbook", "mit-ocw-6-050j-unit-8-textbook",
"mit-ocw-6-050j-unit-13-textbook" "mit-ocw-6-050j-unit-13-textbook"
], ],
@@ -4,19 +4,34 @@
- Domain: `MIT OCW Information and Entropy` - Domain: `MIT OCW Information and Entropy`
## Mastered Concepts ## Mastered Concepts
- mit-ocw-information-and-entropy::assessment-structure
- mit-ocw-information-and-entropy::channel-capacity - mit-ocw-information-and-entropy::channel-capacity
- mit-ocw-information-and-entropy::channel-coding - mit-ocw-information-and-entropy::channel-coding
- mit-ocw-information-and-entropy::counting-and-probability - mit-ocw-information-and-entropy::counting-and-probability
- mit-ocw-information-and-entropy::course-notes-and-reference-texts
- mit-ocw-information-and-entropy::cryptography-and-information-hiding - mit-ocw-information-and-entropy::cryptography-and-information-hiding
- mit-ocw-information-and-entropy::data-compression
- mit-ocw-information-and-entropy::error-correcting-codes - mit-ocw-information-and-entropy::error-correcting-codes
- mit-ocw-information-and-entropy::huffman-coding - mit-ocw-information-and-entropy::huffman-coding
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy - mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison
- mit-ocw-information-and-entropy::information-and-entropy
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence
- mit-ocw-information-and-entropy::mutual-information - mit-ocw-information-and-entropy::mutual-information
- mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work
- mit-ocw-information-and-entropy::prerequisites-and-mathematical-background
- mit-ocw-information-and-entropy::shannon-entropy - mit-ocw-information-and-entropy::shannon-entropy
- mit-ocw-information-and-entropy::source-coding-and-compression
- mit-ocw-information-and-entropy::thermodynamics-and-entropy - mit-ocw-information-and-entropy::thermodynamics-and-entropy
- mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation
## Concept Summaries ## Concept Summaries
### mit-ocw-information-and-entropy::assessment-structure
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::channel-capacity ### mit-ocw-information-and-entropy::channel-capacity
- correctness: 0.84 - correctness: 0.84
- critique: 0.80 - critique: 0.80
@@ -35,13 +50,13 @@
- explanation: 0.85 - explanation: 0.85
- weak dimensions: none - weak dimensions: none
### mit-ocw-information-and-entropy::cryptography-and-information-hiding ### mit-ocw-information-and-entropy::course-notes-and-reference-texts
- correctness: 0.84 - correctness: 0.84
- critique: 0.80 - critique: 0.80
- explanation: 0.85 - explanation: 0.85
- weak dimensions: none - weak dimensions: none
### mit-ocw-information-and-entropy::data-compression ### mit-ocw-information-and-entropy::cryptography-and-information-hiding
- correctness: 0.84 - correctness: 0.84
- critique: 0.80 - critique: 0.80
- explanation: 0.85 - explanation: 0.85
@@ -59,7 +74,31 @@
- explanation: 0.85 - explanation: 0.85
- weak dimensions: none - weak dimensions: none
### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy ### mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::information-and-entropy
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence
- correctness: 0.84 - correctness: 0.84
- critique: 0.80 - critique: 0.80
- explanation: 0.85 - explanation: 0.85
@@ -71,24 +110,57 @@
- explanation: 0.85 - explanation: 0.85
- weak dimensions: none - weak dimensions: none
### mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::prerequisites-and-mathematical-background
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::shannon-entropy ### mit-ocw-information-and-entropy::shannon-entropy
- correctness: 0.84 - correctness: 0.84
- critique: 0.80 - critique: 0.80
- explanation: 0.85 - explanation: 0.85
- weak dimensions: none - weak dimensions: none
### mit-ocw-information-and-entropy::source-coding-and-compression
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::thermodynamics-and-entropy ### mit-ocw-information-and-entropy::thermodynamics-and-entropy
- correctness: 0.84 - correctness: 0.84
- critique: 0.80 - critique: 0.80
- explanation: 0.85 - explanation: 0.85
- weak dimensions: none - weak dimensions: none
### mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
## Artifacts ## Artifacts
- mit-ocw-6-050j-information-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy - mit-ocw-6-050j-information-and-entropy-course-home.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home
- information-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::information-and-entropy
- ultimate-limits-to-communication-and-computation.md (symbolic) for mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation
- open-textbooks-problem-sets-and-programming-work.md (symbolic) for mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work
- mit-ocw-6-050j-information-and-entropy-syllabus.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus
- prerequisites-and-mathematical-background.md (symbolic) for mit-ocw-information-and-entropy::prerequisites-and-mathematical-background
- assessment-structure.md (symbolic) for mit-ocw-information-and-entropy::assessment-structure
- course-notes-and-reference-texts.md (symbolic) for mit-ocw-information-and-entropy::course-notes-and-reference-texts
- independent-reasoning-and-careful-comparison.md (symbolic) for mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison
- mit-ocw-6-050j-information-and-entropy-unit-sequence.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence
- counting-and-probability.md (symbolic) for mit-ocw-information-and-entropy::counting-and-probability - counting-and-probability.md (symbolic) for mit-ocw-information-and-entropy::counting-and-probability
- shannon-entropy.md (symbolic) for mit-ocw-information-and-entropy::shannon-entropy - shannon-entropy.md (symbolic) for mit-ocw-information-and-entropy::shannon-entropy
- mutual-information.md (symbolic) for mit-ocw-information-and-entropy::mutual-information - mutual-information.md (symbolic) for mit-ocw-information-and-entropy::mutual-information
- data-compression.md (symbolic) for mit-ocw-information-and-entropy::data-compression - source-coding-and-compression.md (symbolic) for mit-ocw-information-and-entropy::source-coding-and-compression
- huffman-coding.md (symbolic) for mit-ocw-information-and-entropy::huffman-coding - huffman-coding.md (symbolic) for mit-ocw-information-and-entropy::huffman-coding
- channel-capacity.md (symbolic) for mit-ocw-information-and-entropy::channel-capacity - channel-capacity.md (symbolic) for mit-ocw-information-and-entropy::channel-capacity
- channel-coding.md (symbolic) for mit-ocw-information-and-entropy::channel-coding - channel-coding.md (symbolic) for mit-ocw-information-and-entropy::channel-coding
@@ -1,14 +1,23 @@
# Generated Course Summary # Generated Course Summary
- Pack dir: `/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy` - Pack dir: `domain-packs/mit-ocw-information-entropy`
- Run dir: `/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy-run` - Run dir: `examples/ocw-information-entropy-run`
## Curriculum Path Used By The Demo Learner ## Curriculum Path Used By The Demo Learner
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy - mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home
- mit-ocw-information-and-entropy::information-and-entropy
- mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation
- mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus
- mit-ocw-information-and-entropy::prerequisites-and-mathematical-background
- mit-ocw-information-and-entropy::assessment-structure
- mit-ocw-information-and-entropy::course-notes-and-reference-texts
- mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence
- mit-ocw-information-and-entropy::counting-and-probability - mit-ocw-information-and-entropy::counting-and-probability
- mit-ocw-information-and-entropy::shannon-entropy - mit-ocw-information-and-entropy::shannon-entropy
- mit-ocw-information-and-entropy::mutual-information - mit-ocw-information-and-entropy::mutual-information
- mit-ocw-information-and-entropy::data-compression - mit-ocw-information-and-entropy::source-coding-and-compression
- mit-ocw-information-and-entropy::huffman-coding - mit-ocw-information-and-entropy::huffman-coding
- mit-ocw-information-and-entropy::channel-capacity - mit-ocw-information-and-entropy::channel-capacity
- mit-ocw-information-and-entropy::channel-coding - mit-ocw-information-and-entropy::channel-coding
@@ -17,14 +26,23 @@
- mit-ocw-information-and-entropy::thermodynamics-and-entropy - mit-ocw-information-and-entropy::thermodynamics-and-entropy
## Mastered Concepts ## Mastered Concepts
- mit-ocw-information-and-entropy::assessment-structure
- mit-ocw-information-and-entropy::channel-capacity - mit-ocw-information-and-entropy::channel-capacity
- mit-ocw-information-and-entropy::channel-coding - mit-ocw-information-and-entropy::channel-coding
- mit-ocw-information-and-entropy::counting-and-probability - mit-ocw-information-and-entropy::counting-and-probability
- mit-ocw-information-and-entropy::course-notes-and-reference-texts
- mit-ocw-information-and-entropy::cryptography-and-information-hiding - mit-ocw-information-and-entropy::cryptography-and-information-hiding
- mit-ocw-information-and-entropy::data-compression
- mit-ocw-information-and-entropy::error-correcting-codes - mit-ocw-information-and-entropy::error-correcting-codes
- mit-ocw-information-and-entropy::huffman-coding - mit-ocw-information-and-entropy::huffman-coding
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy - mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison
- mit-ocw-information-and-entropy::information-and-entropy
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence
- mit-ocw-information-and-entropy::mutual-information - mit-ocw-information-and-entropy::mutual-information
- mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work
- mit-ocw-information-and-entropy::prerequisites-and-mathematical-background
- mit-ocw-information-and-entropy::shannon-entropy - mit-ocw-information-and-entropy::shannon-entropy
- mit-ocw-information-and-entropy::source-coding-and-compression
- mit-ocw-information-and-entropy::thermodynamics-and-entropy - mit-ocw-information-and-entropy::thermodynamics-and-entropy
- mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation
@@ -45,10 +45,38 @@ class PlatformConfig(BaseModel):
return self.dimension_thresholds return self.dimension_thresholds
class LocalProviderConfig(BaseModel):
backend: str = "stub"
model_name: str = "local-demo"
class RoleMeshProviderConfig(BaseModel):
base_url: str = os.getenv("DIDACTOPUS_ROLEMESH_BASE_URL", "http://127.0.0.1:8000")
api_key: str = os.getenv("DIDACTOPUS_ROLEMESH_API_KEY", "")
default_model: str = "planner"
role_to_model: dict[str, str] = Field(
default_factory=lambda: {
"mentor": "planner",
"learner": "writer",
"practice": "writer",
"project_advisor": "planner",
"evaluator": "reviewer",
}
)
timeout_seconds: float = 30.0
class ModelProviderConfig(BaseModel):
provider: str = "stub"
local: LocalProviderConfig = Field(default_factory=LocalProviderConfig)
rolemesh: RoleMeshProviderConfig = Field(default_factory=RoleMeshProviderConfig)
class AppConfig(BaseModel): class AppConfig(BaseModel):
review: ReviewConfig = Field(default_factory=ReviewConfig) review: ReviewConfig = Field(default_factory=ReviewConfig)
bridge: BridgeConfig = Field(default_factory=BridgeConfig) bridge: BridgeConfig = Field(default_factory=BridgeConfig)
platform: PlatformConfig = Field(default_factory=PlatformConfig) platform: PlatformConfig = Field(default_factory=PlatformConfig)
model_provider: ModelProviderConfig = Field(default_factory=ModelProviderConfig)
def load_settings() -> Settings: def load_settings() -> Settings:
@@ -64,4 +92,6 @@ def _with_platform_defaults(data: dict[str, Any]) -> dict[str, Any]:
normalized = dict(data) normalized = dict(data)
if "platform" not in normalized: if "platform" not in normalized:
normalized["platform"] = {} normalized["platform"] = {}
if "model_provider" not in normalized:
normalized["model_provider"] = {}
return normalized return normalized
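
The config hunk above back-fills missing top-level sections so the pydantic models can apply their own defaults. A minimal stand-alone sketch of that defaulting pattern (the name `with_defaults` is illustrative, not the package's; no pydantic required):

```python
# Sketch of the defaulting pattern from _with_platform_defaults: absent
# sections become empty dicts, so downstream models fill in their defaults.
def with_defaults(data):
    normalized = dict(data)  # shallow copy; the caller's dict is untouched
    normalized.setdefault("platform", {})
    normalized.setdefault("model_provider", {})
    return normalized
```

With this shape, an empty user config still yields a fully-defaulted `model_provider` block, while an explicit section passes through untouched.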
@@ -126,6 +126,19 @@ def detect_adapter(path: str | Path) -> str:
return "text" return "text"
def is_supported_document(path: str | Path) -> bool:
p = Path(path)
return p.is_file() and detect_adapter(p) in {"markdown", "text", "html", "pdf", "docx", "pptx"}
def adapt_documents(path: str | Path) -> list[NormalizedDocument]:
p = Path(path)
if p.is_dir():
docs = [adapt_document(child) for child in sorted(p.rglob("*")) if is_supported_document(child)]
return docs
return [adapt_document(p)]
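
The new `adapt_documents` walks a directory with `sorted(p.rglob("*"))` and keeps only supported files. The traversal-and-filter pattern can be sketched in isolation (suffix set and `collect_supported` name are illustrative stand-ins for `detect_adapter`/`is_supported_document`):

```python
from pathlib import Path

# Illustrative subset of the adapters the diff supports.
SUPPORTED_SUFFIXES = {".md", ".txt", ".html"}

def collect_supported(root):
    """Return supported files under root in deterministic (sorted-path) order."""
    root = Path(root)
    if root.is_dir():
        return [p for p in sorted(root.rglob("*"))
                if p.is_file() and p.suffix.lower() in SUPPORTED_SUFFIXES]
    return [root]  # single-file input passes straight through
```

Sorting before filtering is what makes repeated ingestion runs produce the same document order, and hence stable pack output.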
def adapt_document(path: str | Path) -> NormalizedDocument: def adapt_document(path: str | Path) -> NormalizedDocument:
adapter = detect_adapter(path) adapter = detect_adapter(path)
if adapter == "markdown": if adapter == "markdown":
@@ -1,4 +1,5 @@
from .model_provider import ModelProvider from .model_provider import ModelProvider
from .role_prompts import mentor_system_prompt
def generate_socratic_prompt(provider: ModelProvider, concept: str, weak_dimensions: list[str] | None = None) -> str: def generate_socratic_prompt(provider: ModelProvider, concept: str, weak_dimensions: list[str] | None = None) -> str:
@@ -6,5 +7,7 @@ def generate_socratic_prompt(provider: ModelProvider, concept: str, weak_dimensi
if weak_dimensions: if weak_dimensions:
weak_text = f" Focus especially on weak dimensions: {', '.join(weak_dimensions)}." weak_text = f" Focus especially on weak dimensions: {', '.join(weak_dimensions)}."
return provider.generate( return provider.generate(
f"You are a Socratic mentor. Ask one probing question about '{concept}'.{weak_text}" f"Ask one probing question about '{concept}'.{weak_text}",
role="mentor",
system_prompt=mentor_system_prompt(),
).text ).text
@@ -1,4 +1,10 @@
from __future__ import annotations
from dataclasses import dataclass from dataclasses import dataclass
import json
from typing import Callable
from urllib import request
from .config import ModelProviderConfig from .config import ModelProviderConfig
@@ -13,11 +19,92 @@
def __init__(self, config: ModelProviderConfig) -> None: def __init__(self, config: ModelProviderConfig) -> None:
self.config = config self.config = config
def generate(self, prompt: str) -> ModelResponse: def pending_notice(self, role: str | None, model_name: str | None = None) -> str:
label = role or "assistant"
notices = {
"mentor": "Didactopus is reviewing the next learning step before answering.",
"learner": "Didactopus is drafting the learner-side reflection now.",
"practice": "Didactopus is designing a practice task for you now.",
"project_advisor": "Didactopus is sketching a project direction now.",
"evaluator": "Didactopus is evaluating the work before replying.",
}
notice = notices.get(label, "Didactopus is preparing the next response.")
if model_name:
return f"{notice} Model: {model_name}."
return notice
def generate(
self,
prompt: str,
role: str | None = None,
system_prompt: str | None = None,
temperature: float | None = None,
max_tokens: int | None = None,
status_callback: Callable[[str], None] | None = None,
) -> ModelResponse:
provider_name = self.config.provider.lower()
if provider_name == "rolemesh":
return self._generate_rolemesh(prompt, role, system_prompt, temperature, max_tokens, status_callback)
return self._generate_stub(prompt, role)
def _generate_stub(self, prompt: str, role: str | None) -> ModelResponse:
local = self.config.local local = self.config.local
preview = prompt.strip().replace("\n", " ")[:120] preview = prompt.strip().replace("\n", " ")[:120]
role_text = f"[{role}] " if role else ""
return ModelResponse( return ModelResponse(
text=f"[stubbed-response] {preview}", text=f"[stubbed-response] {role_text}{preview}",
provider=local.backend, provider=local.backend,
model_name=local.model_name, model_name=local.model_name,
) )
def _generate_rolemesh(
self,
prompt: str,
role: str | None,
system_prompt: str | None,
temperature: float | None,
max_tokens: int | None,
status_callback: Callable[[str], None] | None,
) -> ModelResponse:
rolemesh = self.config.rolemesh
model_name = rolemesh.role_to_model.get(role or "", rolemesh.default_model)
if status_callback is not None:
status_callback(self.pending_notice(role, model_name))
messages = []
if system_prompt:
messages.append({"role": "system", "content": system_prompt})
messages.append({"role": "user", "content": prompt})
payload = {
"model": model_name,
"messages": messages,
}
if temperature is not None:
payload["temperature"] = temperature
if max_tokens is not None:
payload["max_tokens"] = max_tokens
body = self._rolemesh_chat_completion(payload)
choices = body.get("choices", [])
if not choices:
raise RuntimeError("RoleMesh returned no choices.")
message = choices[0].get("message", {})
text = message.get("content", "")
if not isinstance(text, str):
text = str(text)
return ModelResponse(text=text, provider="rolemesh", model_name=model_name)
def _rolemesh_chat_completion(self, payload: dict) -> dict:
rolemesh = self.config.rolemesh
url = rolemesh.base_url.rstrip("/") + "/v1/chat/completions"
headers = {
"Content-Type": "application/json",
}
if rolemesh.api_key:
headers["X-Api-Key"] = rolemesh.api_key
req = request.Request(
url,
data=json.dumps(payload).encode("utf-8"),
headers=headers,
method="POST",
)
with request.urlopen(req, timeout=rolemesh.timeout_seconds) as response:
return json.loads(response.read().decode("utf-8"))
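
`_generate_rolemesh` posts an OpenAI-style chat-completion payload to the gateway. The request-building half can be exercised without a network (the `build_chat_request` name is illustrative; the URL suffix, `X-Api-Key` header, and payload keys mirror the diff above):

```python
import json

def build_chat_request(base_url, model, prompt, system_prompt=None,
                       api_key="", temperature=None, max_tokens=None):
    """Assemble (url, headers, body) for a RoleMesh-style chat completion."""
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": prompt})
    payload = {"model": model, "messages": messages}
    # Optional sampling knobs are omitted entirely when unset, not sent as null.
    if temperature is not None:
        payload["temperature"] = temperature
    if max_tokens is not None:
        payload["max_tokens"] = max_tokens
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["X-Api-Key"] = api_key
    url = base_url.rstrip("/") + "/v1/chat/completions"
    return url, headers, json.dumps(payload).encode("utf-8")
```

Keeping payload assembly separate from `urllib.request.urlopen` makes this half unit-testable even when no local gateway is running.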
@@ -6,7 +6,7 @@ from pathlib import Path
from .agentic_loop import AgenticStudentState, integrate_attempt from .agentic_loop import AgenticStudentState, integrate_attempt
from .artifact_registry import validate_pack from .artifact_registry import validate_pack
from .course_ingestion_compliance import build_pack_compliance_manifest, load_sources, write_manifest from .course_ingestion_compliance import build_pack_compliance_manifest, load_sources, write_manifest
from .document_adapters import adapt_document from .document_adapters import adapt_documents
from .evaluator_pipeline import LearnerAttempt from .evaluator_pipeline import LearnerAttempt
from .graph_builder import build_concept_graph from .graph_builder import build_concept_graph
from .mastery_ledger import ( from .mastery_ledger import (
@@ -15,7 +15,7 @@ from .mastery_ledger import (
export_capability_profile_json, export_capability_profile_json,
export_capability_report_markdown, export_capability_report_markdown,
) )
from .pack_emitter import build_draft_pack, write_draft_pack from .pack_emitter import build_draft_pack, write_draft_pack, write_source_corpus
from .rule_policy import RuleContext, build_default_rules, run_rules from .rule_policy import RuleContext, build_default_rules, run_rules
from .topic_ingest import build_topic_bundle, document_to_course, extract_concept_candidates, merge_courses_into_topic_course from .topic_ingest import build_topic_bundle, document_to_course, extract_concept_candidates, merge_courses_into_topic_course
@@ -38,8 +38,9 @@ Use this skill when the task is about tutoring, evaluating, or planning study in
1. Read `references/generated-course-summary.md` for the pack structure and target concepts. 1. Read `references/generated-course-summary.md` for the pack structure and target concepts.
2. Read `references/generated-capability-summary.md` to understand what the demo AI learner already mastered. 2. Read `references/generated-capability-summary.md` to understand what the demo AI learner already mastered.
3. Use `assets/generated/pack/` as the source of truth for concept ids, prerequisites, and mastery signals. 3. Use `assets/generated/pack/` as the source of truth for concept ids, prerequisites, and mastery signals.
4. When giving guidance, preserve the pack ordering from fundamentals through coding and thermodynamics. 4. Use `assets/generated/pack/source_corpus.json` to ground explanations in the ingested source material before relying on model prior knowledge.
5. When uncertain, say which concept or prerequisite in the generated pack is underspecified. 5. When giving guidance, preserve the pack ordering from fundamentals through coding and thermodynamics.
6. When uncertain, say which concept or prerequisite in the generated pack is underspecified and which source fragment would need review.
## Outputs ## Outputs
@@ -122,6 +123,15 @@ def _write_skill_bundle(
(run_asset_dir / source.name).write_text(source.read_text(encoding="utf-8"), encoding="utf-8") (run_asset_dir / source.name).write_text(source.read_text(encoding="utf-8"), encoding="utf-8")
def _select_target_concept(pack_name: str, concepts: list, preferred_id: str = "thermodynamics-and-entropy") -> str:
ids = [concept.id for concept in concepts]
if preferred_id in ids:
return f"{pack_name}::{preferred_id}"
if not ids:
raise ValueError("No concept candidates available to select as target.")
return f"{pack_name}::{ids[-1]}"
def run_ocw_information_entropy_demo( def run_ocw_information_entropy_demo(
course_source: str | Path, course_source: str | Path,
source_inventory: str | Path, source_inventory: str | Path,
@@ -135,9 +145,11 @@ def run_ocw_information_entropy_demo(
run_dir = Path(run_dir) run_dir = Path(run_dir)
skill_dir = Path(skill_dir) skill_dir = Path(skill_dir)
doc = adapt_document(course_source) docs = adapt_documents(course_source)
course = document_to_course(doc, "MIT OCW Information and Entropy") if not docs:
merged = merge_courses_into_topic_course(build_topic_bundle(course.title, [course])) raise ValueError(f"No supported source documents found under {course_source}")
courses = [document_to_course(doc, "MIT OCW Information and Entropy") for doc in docs]
merged = merge_courses_into_topic_course(build_topic_bundle("MIT OCW Information and Entropy", courses))
merged.rights_note = DEFAULT_RIGHTS_NOTE merged.rights_note = DEFAULT_RIGHTS_NOTE
concepts = extract_concept_candidates(merged) concepts = extract_concept_candidates(merged)
@@ -153,6 +165,7 @@ def run_ocw_information_entropy_demo(
conflicts=[], conflicts=[],
) )
write_draft_pack(draft, pack_dir) write_draft_pack(draft, pack_dir)
write_source_corpus(merged, pack_dir)
if source_inventory.exists(): if source_inventory.exists():
inventory = load_sources(source_inventory) inventory = load_sources(source_inventory)
compliance_manifest = build_pack_compliance_manifest(draft.pack["name"], draft.pack["display_name"], inventory) compliance_manifest = build_pack_compliance_manifest(draft.pack["name"], draft.pack["display_name"], inventory)
@@ -170,7 +183,7 @@ def run_ocw_information_entropy_demo(
"project_execution": 0.75, "project_execution": 0.75,
"critique": 0.7, "critique": 0.7,
}) })
target_key = f"{draft.pack['name']}::thermodynamics-and-entropy" target_key = _select_target_concept(draft.pack["name"], ctx.concepts)
concept_path = graph.curriculum_path_to_target(set(), target_key) concept_path = graph.curriculum_path_to_target(set(), target_key)
state = AgenticStudentState( state = AgenticStudentState(
@@ -189,11 +202,13 @@ def run_ocw_information_entropy_demo(
summary = { summary = {
"course_source": str(course_source), "course_source": str(course_source),
"source_document_count": len(docs),
"pack_dir": str(pack_dir), "pack_dir": str(pack_dir),
"skill_dir": str(skill_dir), "skill_dir": str(skill_dir),
"source_inventory": str(source_inventory), "source_inventory": str(source_inventory),
"review_flags": list(ctx.review_flags), "review_flags": list(ctx.review_flags),
"concept_count": len(ctx.concepts), "concept_count": len(ctx.concepts),
"source_fragment_count": len(json.loads((pack_dir / "source_corpus.json").read_text(encoding="utf-8")).get("fragments", [])),
"target_concept": target_key, "target_concept": target_key,
"curriculum_path": concept_path, "curriculum_path": concept_path,
"mastered_concepts": sorted(state.mastered_concepts), "mastered_concepts": sorted(state.mastered_concepts),
@@ -216,7 +231,7 @@ def main() -> None:
parser = argparse.ArgumentParser(description="Generate a domain pack and skill bundle from MIT OCW Information and Entropy.") parser = argparse.ArgumentParser(description="Generate a domain pack and skill bundle from MIT OCW Information and Entropy.")
parser.add_argument( parser.add_argument(
"--course-source", "--course-source",
default=str(root / "examples" / "ocw-information-entropy" / "6-050j-information-and-entropy.md"), default=str(root / "examples" / "ocw-information-entropy" / "course"),
) )
parser.add_argument( parser.add_argument(
"--source-inventory", "--source-inventory",
@@ -0,0 +1,451 @@
from __future__ import annotations
import json
from pathlib import Path
import sys
from .config import load_config
from .model_provider import ModelProvider
from .ocw_skill_agent_demo import load_ocw_skill_context
from .role_prompts import evaluator_system_prompt, learner_system_prompt, mentor_system_prompt, practice_system_prompt
def _format_turn(role: str, speaker: str, content: str) -> dict[str, str]:
return {"role": role, "speaker": speaker, "content": content.strip()}
def _normalize_completion(text: str) -> str:
stripped = text.strip()
if not stripped:
return stripped
if _looks_truncated(stripped):
return stripped.rstrip(" ,;:-") + "."
return stripped
def _looks_truncated(text: str) -> bool:
def _ends_with_truncated_marker(line: str) -> bool:
lowered_line = line.lower()
return any(lowered_line.endswith(marker) for marker in truncated_markers)
stripped = text.strip()
if not stripped:
return True
if stripped.endswith(("...", "…")):
return True
truncated_markers = (
"for example,",
"for instance,",
"such as",
"this means",
"therefore",
"however",
"furthermore",
"in particular",
"suppose we have",
"which means",
"so the",
"is h =",
"compare the entropy of one roll with the",
"with a crossover",
)
lowered = stripped.lower()
if _ends_with_truncated_marker(lowered):
return True
if stripped[-1] not in ".!?)]}\"'":
return True
lines = [line.strip() for line in stripped.splitlines() if line.strip()]
if len(lines) >= 2:
for idx in range(len(lines) - 1):
current = lines[idx]
nxt = lines[idx + 1]
if nxt[:1].isdigit():
if _ends_with_truncated_marker(current) or current[-1] not in (".", "!", "?", ":", ")"):
return True
tail = stripped.rsplit(None, 1)[-1].lower()
return tail in {
"a",
"an",
"and",
"as",
"because",
"but",
"for",
"if",
"in",
"of",
"or",
"so",
"the",
"to",
"with",
}
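
`_looks_truncated` layers several heuristics: ellipsis endings, marker phrases, missing terminal punctuation, and a dangling function-word tail. A condensed sketch of the core idea (simplified and reordered so the dangling-word check fires even when punctuation is present; this is an illustration, not the exact function above):

```python
# Function words that signal a sentence was cut off mid-thought.
DANGLING = {"a", "an", "and", "as", "because", "but", "for", "if",
            "in", "of", "or", "so", "the", "to", "with"}

def looks_truncated(text):
    """Heuristic: does this completion look cut off mid-sentence?"""
    stripped = text.strip()
    if not stripped:
        return True
    if stripped.endswith("..."):
        return True
    # A trailing function word ("...with the") almost always means truncation.
    if stripped.rsplit(None, 1)[-1].lower() in DANGLING:
        return True
    # Otherwise require a plausible sentence-final character.
    return stripped[-1] not in ".!?)]}\"'"
```

Heuristics like this are cheap but lossy; the diff compensates by looping a bounded number of "continue the previous response" calls rather than trusting any single check.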
def _is_topical(text: str, required_terms: list[str], forbidden_terms: list[str] | None = None) -> bool:
lowered = text.lower()
if forbidden_terms and any(term in lowered for term in forbidden_terms):
return False
return any(term in lowered for term in required_terms)
def _generate_checked(
provider: ModelProvider,
prompt: str,
role: str,
system_prompt: str,
required_terms: list[str],
forbidden_terms: list[str] | None = None,
temperature: float = 0.2,
max_tokens: int = 180,
retries: int = 2,
status_callback=None,
) -> str:
attempt_prompt = prompt
for _ in range(retries + 1):
text = provider.generate(
attempt_prompt,
role=role,
system_prompt=system_prompt,
temperature=temperature,
max_tokens=max_tokens,
status_callback=status_callback,
).text
if _is_topical(text, required_terms, forbidden_terms):
completed = text.strip()
continuation_budget = max(64, max_tokens // 2)
for _continuation_idx in range(3):
if not _looks_truncated(completed):
break
continuation = provider.generate(
"Continue the previous response without restarting it. Finish the thought cleanly and end with a complete sentence.\n\n"
f"Current draft:\n{completed}",
role=role,
system_prompt=system_prompt,
temperature=min(temperature, 0.2),
max_tokens=continuation_budget,
status_callback=status_callback,
).text.strip()
if not continuation or continuation == completed:
break
completed = f"{completed.rstrip()} {continuation.lstrip()}"
if not _looks_truncated(continuation):
break
return _normalize_completion(completed)
attempt_prompt = (
prompt
+ " Stay strictly within information theory, entropy, probability, coding, or thermodynamics. "
+ "Do not switch to politics, programming style, or unrelated topics."
)
return _normalize_completion(text)
def _concept_title_map(context) -> dict[str, str]:
return {concept.get("id", ""): concept.get("title", concept.get("id", "")) for concept in context.concepts}
def _path_titles(context, limit: int | None = None) -> list[str]:
title_map = _concept_title_map(context)
titles: list[str] = []
for concept_key in context.run_summary.get("curriculum_path", []):
concept_id = concept_key.split("::", 1)[-1]
titles.append(title_map.get(concept_id, concept_id.replace("-", " ").title()))
return titles[:limit] if limit is not None else titles
def _healthy_rolemesh_models(provider: ModelProvider) -> set[str]:
config = provider.config
if config.provider.lower() != "rolemesh":
return set()
models = set(config.rolemesh.role_to_model.values()) | {config.rolemesh.default_model}
healthy: set[str] = set()
for model in models:
try:
provider._rolemesh_chat_completion( # type: ignore[attr-defined]
{
"model": model,
"messages": [{"role": "user", "content": "Reply with ok."}],
"max_tokens": 16,
"temperature": 0.0,
}
)
healthy.add(model)
except Exception:
continue
return healthy
def _apply_rolemesh_fallbacks(provider: ModelProvider) -> dict[str, str]:
config = provider.config
if config.provider.lower() != "rolemesh":
return {}
healthy = _healthy_rolemesh_models(provider)
if not healthy:
raise RuntimeError("No healthy RoleMesh models available for transcript generation.")
fallback_model = config.rolemesh.default_model if config.rolemesh.default_model in healthy else sorted(healthy)[0]
adjusted = {}
for role, model in list(config.rolemesh.role_to_model.items()):
if model not in healthy:
config.rolemesh.role_to_model[role] = fallback_model
adjusted[role] = fallback_model
return adjusted
def build_ocw_rolemesh_transcript(config_path: str | Path, skill_dir: str | Path) -> dict:
config = load_config(config_path)
provider = ModelProvider(config.model_provider)
context = load_ocw_skill_context(skill_dir)
role_fallbacks = _apply_rolemesh_fallbacks(provider)
status_updates: list[str] = []
def emit_status(message: str) -> None:
status_updates.append(message)
print(message, file=sys.stderr, flush=True)
goal = (
"I want to use the MIT OCW Information and Entropy course to understand how Shannon entropy, "
"channel capacity, and thermodynamic entropy relate without skipping the reasoning."
)
path_titles = _path_titles(context)
turns: list[dict[str, str]] = []
turns.append(_format_turn("user", "Learner Goal", goal))
mentor_open = _generate_checked(
provider,
"The learner wants to approach the Information and Entropy course carefully. "
"Ask a short opening question that checks what they already understand and points them to the first concept.",
role="mentor",
system_prompt=mentor_system_prompt(),
required_terms=["information", "entropy", "probability", "counting"],
forbidden_terms=["president", "executive branch", "code"],
temperature=0.2,
max_tokens=140,
status_callback=emit_status,
)
turns.append(_format_turn("assistant", "Didactopus Mentor", mentor_open))
learner_reflection = _generate_checked(
provider,
"Respond as the learner. Mention partial understanding of randomness and probability, but uncertainty about "
"how that becomes entropy and communication limits in information theory.",
role="learner",
system_prompt=learner_system_prompt(),
required_terms=["probability", "entropy", "information", "uncertainty"],
forbidden_terms=["president", "executive branch", "code"],
temperature=0.5,
max_tokens=140,
status_callback=emit_status,
)
turns.append(_format_turn("assistant", "AI Learner", learner_reflection))
mentor_guidance = _generate_checked(
provider,
"Given the learner reflection, explain the first two concepts to study from the generated path and why. "
f"Path reference: {path_titles[:4]}",
role="mentor",
system_prompt=mentor_system_prompt(),
required_terms=["counting", "probability", "entropy", "information"],
forbidden_terms=["president", "executive branch", "code"],
temperature=0.2,
max_tokens=280,
status_callback=emit_status,
)
turns.append(_format_turn("assistant", "Didactopus Mentor", mentor_guidance))
practice_task = _generate_checked(
provider,
"Generate one short practice task that forces the learner to connect counting/probability with Shannon entropy, "
"without giving away the full answer.",
role="practice",
system_prompt=practice_system_prompt(),
required_terms=["entropy", "probability", "die", "uncertainty", "shannon"],
forbidden_terms=["president", "executive branch", "code"],
temperature=0.4,
max_tokens=220,
status_callback=emit_status,
)
turns.append(_format_turn("assistant", "Didactopus Practice Designer", practice_task))
learner_attempt = _generate_checked(
provider,
f"Respond as the learner attempting this task in information theory: {practice_task} "
"Give a concise answer in your own words, with one uncertainty still present.",
role="learner",
system_prompt=learner_system_prompt(),
required_terms=["entropy", "probability", "uncertainty", "die", "message"],
forbidden_terms=["president", "executive branch", "code"],
temperature=0.5,
max_tokens=280,
status_callback=emit_status,
)
turns.append(_format_turn("assistant", "AI Learner", learner_attempt))
evaluator_feedback = _generate_checked(
provider,
"Evaluate this learner attempt for correctness, explanation quality, and limitations. "
f"Task: {practice_task}\nAttempt: {learner_attempt}",
role="evaluator",
system_prompt=evaluator_system_prompt(),
required_terms=["correctness", "entropy", "probability", "explanation", "limitation"],
forbidden_terms=["president", "executive branch", "code structure"],
temperature=0.2,
max_tokens=260,
status_callback=emit_status,
)
turns.append(_format_turn("assistant", "Didactopus Evaluator", evaluator_feedback))
mentor_next_step = _generate_checked(
provider,
"Given the evaluator feedback, tell the learner what to do next before moving on to channel capacity. "
"Use the course path to show what comes next.",
role="mentor",
system_prompt=mentor_system_prompt(),
required_terms=["channel capacity", "entropy", "probability", "next"],
forbidden_terms=["president", "executive branch", "code structure"],
temperature=0.2,
max_tokens=220,
status_callback=emit_status,
)
turns.append(_format_turn("assistant", "Didactopus Mentor", mentor_next_step))
stage_specs = [
{
"topic": "Channel Capacity",
"path_slice": path_titles[4:7] or path_titles,
"practice_anchor": "binary symmetric channel",
"required_terms": ["channel", "capacity", "entropy", "noise"],
},
{
"topic": "Coding and Compression",
"path_slice": path_titles[5:9] or path_titles,
"practice_anchor": "compression and error-correcting code",
"required_terms": ["coding", "compression", "redundancy", "error"],
},
{
"topic": "Thermodynamic Entropy and Synthesis",
"path_slice": path_titles[8:] or path_titles,
"practice_anchor": "thermodynamic entropy",
"required_terms": ["thermodynamic", "entropy", "information", "physical"],
},
]
for stage in stage_specs:
mentor_stage = _generate_checked(
provider,
f"The learner is continuing through the MIT OCW Information and Entropy course. "
f"Bridge from the previous work into {stage['topic']}. "
f"Reference this path segment: {stage['path_slice']}. "
"Explain what the learner should focus on before attempting a problem.",
role="mentor",
system_prompt=mentor_system_prompt(),
required_terms=stage["required_terms"],
forbidden_terms=["president", "executive branch", "code structure"],
temperature=0.2,
max_tokens=260,
status_callback=emit_status,
)
turns.append(_format_turn("assistant", "Didactopus Mentor", mentor_stage))
learner_stage = _generate_checked(
provider,
f"Respond as the learner after studying {stage['topic']}. "
f"Summarize what now makes sense and one remaining difficulty about {stage['practice_anchor']}.",
role="learner",
system_prompt=learner_system_prompt(),
required_terms=stage["required_terms"],
forbidden_terms=["president", "executive branch", "code structure"],
temperature=0.4,
max_tokens=220,
status_callback=emit_status,
)
turns.append(_format_turn("assistant", "AI Learner", learner_stage))
practice_stage = _generate_checked(
provider,
f"Create one short reasoning task about {stage['practice_anchor']} for the learner. "
"Keep it course-relevant and do not provide the full solution.",
role="practice",
system_prompt=practice_system_prompt(),
required_terms=stage["required_terms"],
forbidden_terms=["president", "executive branch", "code structure"],
temperature=0.3,
max_tokens=220,
status_callback=emit_status,
)
turns.append(_format_turn("assistant", "Didactopus Practice Designer", practice_stage))
evaluator_stage = _generate_checked(
provider,
f"Give short evaluator feedback on this learner reflection in the context of {stage['topic']}: "
f"{learner_stage}\nTask context: {practice_stage}",
role="evaluator",
system_prompt=evaluator_system_prompt(),
required_terms=["correctness", "explanation", *stage["required_terms"][:2]],
forbidden_terms=["president", "executive branch", "code structure"],
temperature=0.2,
max_tokens=220,
status_callback=emit_status,
)
turns.append(_format_turn("assistant", "Didactopus Evaluator", evaluator_stage))
return {
"provider": config.model_provider.provider,
"skill": context.skill_name,
"course": context.pack.get("display_name", context.pack.get("name", "")),
"curriculum_path_titles": path_titles,
"role_fallbacks": role_fallbacks,
"status_updates": status_updates,
"transcript": turns,
}
def write_transcript_artifacts(payload: dict, out_dir: str | Path) -> dict[str, str]:
out_dir = Path(out_dir)
out_dir.mkdir(parents=True, exist_ok=True)
json_path = out_dir / "rolemesh_transcript.json"
md_path = out_dir / "rolemesh_transcript.md"
json_path.write_text(json.dumps(payload, indent=2), encoding="utf-8")
lines = [
"# Local LLM Transcript: MIT OCW Information and Entropy",
"",
f"- Provider: `{payload['provider']}`",
f"- Skill: `{payload['skill']}`",
f"- Course: `{payload['course']}`",
"",
]
if payload.get("status_updates"):
lines.append("## Pending Status Examples")
for update in payload["status_updates"][:8]:
lines.append(f"- {update}")
lines.append("")
for turn in payload["transcript"]:
lines.append(f"## {turn['speaker']}")
lines.append(turn["content"])
lines.append("")
md_path.write_text("\n".join(lines), encoding="utf-8")
return {"json": str(json_path), "md": str(md_path)}
def run_ocw_rolemesh_transcript_demo(config_path: str | Path, skill_dir: str | Path, out_dir: str | Path) -> dict:
payload = build_ocw_rolemesh_transcript(config_path, skill_dir)
outputs = write_transcript_artifacts(payload, out_dir)
payload["artifacts"] = outputs
return payload
def main() -> None:
import argparse
root = Path(__file__).resolve().parents[2]
parser = argparse.ArgumentParser(description="Generate a transcript of an AI learner using a local LLM path to approach the MIT OCW Information and Entropy course.")
parser.add_argument("--config", default=str(root / "configs" / "config.rolemesh.example.yaml"))
parser.add_argument("--skill-dir", default=str(root / "skills" / "ocw-information-entropy-agent"))
parser.add_argument("--out-dir", default=str(root / "examples" / "ocw-information-entropy-rolemesh-transcript"))
args = parser.parse_args()
payload = run_ocw_rolemesh_transcript_demo(args.config, args.skill_dir, args.out_dir)
print(json.dumps(payload, indent=2))
if __name__ == "__main__":
main()

View File

@ -6,6 +6,64 @@ import yaml
from .course_schema import NormalizedCourse, ConceptCandidate, DraftPack
def build_source_corpus(course: NormalizedCourse) -> dict:
fragments = []
for module in course.modules:
for lesson in module.lessons:
body = lesson.body.strip()
if body:
fragments.append(
{
"fragment_id": f"{module.title}::{lesson.title}::body".lower().replace(" ", "-"),
"kind": "lesson_body",
"module_title": module.title,
"lesson_title": lesson.title,
"text": body,
"source_refs": list(lesson.source_refs),
"objectives": list(lesson.objectives),
"exercises": list(lesson.exercises),
"key_terms": list(lesson.key_terms),
}
)
for idx, objective in enumerate(lesson.objectives, start=1):
fragments.append(
{
"fragment_id": f"{module.title}::{lesson.title}::objective-{idx}".lower().replace(" ", "-"),
"kind": "objective",
"module_title": module.title,
"lesson_title": lesson.title,
"text": objective,
"source_refs": list(lesson.source_refs),
}
)
for idx, exercise in enumerate(lesson.exercises, start=1):
fragments.append(
{
"fragment_id": f"{module.title}::{lesson.title}::exercise-{idx}".lower().replace(" ", "-"),
"kind": "exercise",
"module_title": module.title,
"lesson_title": lesson.title,
"text": exercise,
"source_refs": list(lesson.source_refs),
}
)
return {
"course_title": course.title,
"rights_note": course.rights_note,
"sources": [
{
"source_path": src.source_path,
"source_type": src.source_type,
"title": src.title,
"metadata": getattr(src, "metadata", {}),
}
for src in course.source_records
],
"fragments": fragments,
}
def build_draft_pack(
course: NormalizedCourse,
concepts: list[ConceptCandidate],
@ -29,6 +87,7 @@ def build_draft_pack(
"overrides": [], "overrides": [],
"profile_templates": {}, "profile_templates": {},
"cross_pack_links": [], "cross_pack_links": [],
"supporting_artifacts": ["source_corpus.json"],
} }
concepts_yaml = { concepts_yaml = {
"concepts": [ "concepts": [
@ -100,3 +159,9 @@ def write_draft_pack(pack: DraftPack, outdir: str | Path) -> None:
conflict_lines = ["# Conflict Report", ""] + [f"- {flag}" for flag in pack.conflicts] if pack.conflicts else ["# Conflict Report", "", "- none"] conflict_lines = ["# Conflict Report", ""] + [f"- {flag}" for flag in pack.conflicts] if pack.conflicts else ["# Conflict Report", "", "- none"]
(out / "conflict_report.md").write_text("\n".join(conflict_lines), encoding="utf-8") (out / "conflict_report.md").write_text("\n".join(conflict_lines), encoding="utf-8")
(out / "license_attribution.json").write_text(json.dumps(pack.attribution, indent=2), encoding="utf-8") (out / "license_attribution.json").write_text(json.dumps(pack.attribution, indent=2), encoding="utf-8")
def write_source_corpus(course: NormalizedCourse, outdir: str | Path) -> None:
out = Path(outdir)
out.mkdir(parents=True, exist_ok=True)
(out / "source_corpus.json").write_text(json.dumps(build_source_corpus(course), indent=2), encoding="utf-8")

View File

@ -1,4 +1,5 @@
from .model_provider import ModelProvider
from .role_prompts import practice_system_prompt
def generate_practice_task(provider: ModelProvider, concept: str, weak_dimensions: list[str] | None = None) -> str:
@ -6,5 +7,7 @@ def generate_practice_task(provider: ModelProvider, concept: str, weak_dimension
if weak_dimensions:
weak_text = f" Target the weak dimensions: {', '.join(weak_dimensions)}."
return provider.generate(
f"Generate one reasoning-heavy practice task for '{concept}'.{weak_text}",
role="practice",
system_prompt=practice_system_prompt(),
).text

View File

@ -1,7 +1,10 @@
from .model_provider import ModelProvider
from .role_prompts import project_advisor_system_prompt
def suggest_capstone(provider: ModelProvider, domain: str) -> str:
return provider.generate(
f"Suggest one realistic capstone project for mastery in {domain}.",
role="project_advisor",
system_prompt=project_advisor_system_prompt(),
).text

View File

@ -0,0 +1,41 @@
from __future__ import annotations
def mentor_system_prompt() -> str:
return (
"You are Didactopus in mentor mode. Help the learner think through the topic without doing the work for them. "
"Prefer Socratic questions, prerequisite reminders, and hints over finished solutions. "
"When responding to a learner attempt or evaluator note, acknowledge what the learner already did correctly before naming gaps. "
"Do not claim a caveat, limitation, or nuance is missing if the learner already stated one; instead say how to sharpen or extend it."
)
def practice_system_prompt() -> str:
return (
"You are Didactopus in practice-design mode. Generate short, reasoning-heavy tasks that force the learner "
"to explain, compare, or derive ideas rather than copy answers."
)
def learner_system_prompt() -> str:
return (
"You are an earnest AI learner using Didactopus to study a topic. Think aloud briefly, attempt the task yourself, "
"and avoid asking for the final answer to be given to you outright."
)
def project_advisor_system_prompt() -> str:
return (
"You are Didactopus in capstone-advisor mode. Suggest realistic project ideas that require synthesis and "
"independent execution, and avoid proposing tasks that can be completed by rote prompting alone."
)
def evaluator_system_prompt() -> str:
return (
"You are Didactopus in evaluator mode. Assess clarity, reasoning, and limitations explicitly. "
"Point out weak assumptions and missing justification instead of giving the polished final answer. "
"Before saying something is missing, first verify whether the learner already included it. "
"If the learner stated a caveat, limitation, or nuance, quote or paraphrase that part and evaluate its quality rather than pretending it is absent. "
"Do not invent omissions that are contradicted by the learner's actual text."
)

View File

@ -0,0 +1,54 @@
from __future__ import annotations
import json
from pathlib import Path
from .config import load_config
from .mentor import generate_socratic_prompt
from .model_provider import ModelProvider
from .practice import generate_practice_task
from .project_advisor import suggest_capstone
from .role_prompts import evaluator_system_prompt
def run_rolemesh_demo(config_path: str | Path, out_path: str | Path | None = None) -> dict:
config = load_config(config_path)
provider = ModelProvider(config.model_provider)
payload = {
"provider": config.model_provider.provider,
"mentor_prompt": generate_socratic_prompt(provider, "channel capacity", ["explanation"]),
"practice_task": generate_practice_task(provider, "Shannon entropy", ["transfer"]),
"capstone": suggest_capstone(provider, "information theory"),
"evaluation_instruction": provider.generate(
"Evaluate a learner explanation of thermodynamic entropy versus Shannon entropy.",
role="evaluator",
system_prompt=evaluator_system_prompt(),
).text,
}
if out_path is not None:
Path(out_path).write_text(json.dumps(payload, indent=2), encoding="utf-8")
return payload
def main() -> None:
import argparse
root = Path(__file__).resolve().parents[2]
parser = argparse.ArgumentParser(description="Run a Didactopus demo against a local RoleMesh-compatible model provider.")
parser.add_argument(
"--config",
default=str(root / "configs" / "config.rolemesh.example.yaml"),
)
parser.add_argument(
"--out",
default=str(root / "examples" / "rolemesh_demo.json"),
)
args = parser.parse_args()
payload = run_rolemesh_demo(args.config, args.out)
print(json.dumps(payload, indent=2))
if __name__ == "__main__":
main()

View File

@ -4,6 +4,47 @@ import re
from collections import defaultdict
from .course_schema import NormalizedDocument, NormalizedCourse, Module, Lesson, TopicBundle, ConceptCandidate
GENERIC_TERM_STOPWORDS = {
"attribution",
"build",
"careful",
"compare",
"comparison",
"compute",
"course",
"decide",
"describe",
"didactopus",
"early",
"exercise",
"explain",
"home",
"identify",
"independent",
"later",
"list",
"notes",
"objective",
"open",
"opencourseware",
"produce",
"programming",
"reference",
"source",
"spring",
"state",
"structure",
"summarize",
"syllabus",
"synthesis",
"synthesize",
"texts",
"these",
"ultimate",
"unit",
"work",
"write",
}
def slugify(text: str) -> str:
cleaned = re.sub(r"[^a-zA-Z0-9]+", "-", text.strip().lower()).strip("-")
@ -15,6 +56,9 @@ def extract_key_terms(text: str, min_term_length: int = 4, max_terms: int = 8) -
seen = set()
out = []
for term in candidates:
lower = term.lower()
if lower in GENERIC_TERM_STOPWORDS:
continue
if term not in seen:
seen.add(term)
out.append(term)
@ -23,6 +67,16 @@ def extract_key_terms(text: str, min_term_length: int = 4, max_terms: int = 8) -
return out
def _parse_signal_line(line: str) -> tuple[str | None, str]:
stripped = line.strip()
if stripped.startswith(("-", "*", "+")):
stripped = stripped[1:].strip()
lowered = stripped.lower()
if lowered.startswith("objective:"):
return "objective", stripped.split(":", 1)[1].strip()
if lowered.startswith("exercise:"):
return "exercise", stripped.split(":", 1)[1].strip()
return None, stripped
def document_to_course(doc: NormalizedDocument, course_title: str) -> NormalizedCourse:
# Conservative mapping: each section becomes a lesson; all lessons go into one module.
lessons = []
@ -34,18 +88,18 @@ def document_to_course(doc: NormalizedDocument, course_title: str) -> Normalized
objectives = []
exercises = []
for line in lines:
kind, value = _parse_signal_line(line)
if kind == "objective":
objectives.append(value)
if kind == "exercise":
exercises.append(value)
lessons.append(
Lesson(
title=section.heading.strip() or "Untitled Lesson",
body=body,
objectives=objectives,
exercises=exercises,
key_terms=extract_key_terms(body),
source_refs=[doc.source_path],
)
)
@ -113,6 +167,8 @@ def extract_concept_candidates(course: NormalizedCourse) -> list[ConceptCandidat
tid = slugify(term)
if tid in seen_ids:
continue
if tid in {slugify(part) for part in lesson.title.split()}:
continue
seen_ids.add(tid)
concepts.append(
ConceptCandidate(

View File

@ -6,3 +6,11 @@ def test_load_example_config() -> None:
config = load_config(Path("configs/config.example.yaml"))
assert config.platform.dimension_thresholds["transfer"] == 0.7
assert config.platform.confidence_threshold == 0.8
assert config.model_provider.provider == "stub"
def test_load_rolemesh_config() -> None:
config = load_config(Path("configs/config.rolemesh.example.yaml"))
assert config.model_provider.provider == "rolemesh"
assert config.model_provider.rolemesh.role_to_model["mentor"] == "planner"
assert config.model_provider.rolemesh.role_to_model["learner"] == "writer"

View File

@ -0,0 +1,82 @@
from didactopus.config import ModelProviderConfig
from didactopus.model_provider import ModelProvider
from didactopus.role_prompts import evaluator_system_prompt, mentor_system_prompt
def test_stub_provider_includes_role_marker() -> None:
provider = ModelProvider(ModelProviderConfig())
response = provider.generate("Explain entropy simply.", role="mentor")
assert response.provider == "stub"
assert "[mentor]" in response.text
def test_rolemesh_provider_uses_role_mapping() -> None:
config = ModelProviderConfig.model_validate(
{
"provider": "rolemesh",
"rolemesh": {
"base_url": "http://127.0.0.1:8000",
"api_key": "demo",
"default_model": "planner",
"role_to_model": {"mentor": "planner", "practice": "writer"},
},
}
)
provider = ModelProvider(config)
def fake_chat(payload: dict) -> dict:
assert payload["model"] == "writer"
assert payload["messages"][0]["role"] == "system"
return {"choices": [{"message": {"content": "Practice task response"}}]}
provider._rolemesh_chat_completion = fake_chat # type: ignore[method-assign]
response = provider.generate(
"Generate a practice task.",
role="practice",
system_prompt="System prompt",
)
assert response.provider == "rolemesh"
assert response.model_name == "writer"
assert response.text == "Practice task response"
def test_rolemesh_provider_emits_pending_notice() -> None:
config = ModelProviderConfig.model_validate(
{
"provider": "rolemesh",
"rolemesh": {
"base_url": "http://127.0.0.1:8000",
"api_key": "demo",
"default_model": "planner",
"role_to_model": {"evaluator": "reviewer"},
},
}
)
provider = ModelProvider(config)
seen: list[str] = []
def fake_chat(payload: dict) -> dict:
return {"choices": [{"message": {"content": "Evaluation response"}}]}
provider._rolemesh_chat_completion = fake_chat # type: ignore[method-assign]
response = provider.generate(
"Evaluate a learner answer.",
role="evaluator",
status_callback=seen.append,
)
assert response.text == "Evaluation response"
assert seen == ["Didactopus is evaluating the work before replying. Model: reviewer."]
def test_evaluator_prompt_requires_checking_existing_caveats() -> None:
prompt = evaluator_system_prompt().lower()
assert "before saying something is missing" in prompt
assert "quote or paraphrase" in prompt
assert "do not invent omissions" in prompt
def test_mentor_prompt_requires_acknowledging_existing_caveats() -> None:
prompt = mentor_system_prompt().lower()
assert "acknowledge what the learner already did correctly" in prompt
assert "do not claim a caveat" in prompt

View File

@ -1,4 +1,5 @@
from pathlib import Path
import json
from didactopus.ocw_information_entropy_demo import run_ocw_information_entropy_demo
@ -6,7 +7,7 @@ from didactopus.ocw_information_entropy_demo import run_ocw_information_entropy_
def test_ocw_information_entropy_demo_generates_pack_and_skill(tmp_path: Path) -> None:
root = Path(__file__).resolve().parents[1]
summary = run_ocw_information_entropy_demo(
course_source=root / "examples" / "ocw-information-entropy" / "course",
source_inventory=root / "examples" / "ocw-information-entropy" / "sources.yaml",
pack_dir=tmp_path / "pack",
run_dir=tmp_path / "run",
@ -15,7 +16,38 @@ def test_ocw_information_entropy_demo_generates_pack_and_skill(tmp_path: Path) -
assert (tmp_path / "pack" / "pack.yaml").exists() assert (tmp_path / "pack" / "pack.yaml").exists()
assert (tmp_path / "pack" / "pack_compliance_manifest.json").exists() assert (tmp_path / "pack" / "pack_compliance_manifest.json").exists()
assert (tmp_path / "pack" / "source_corpus.json").exists()
assert (tmp_path / "run" / "capability_profile.json").exists() assert (tmp_path / "run" / "capability_profile.json").exists()
assert (tmp_path / "skill" / "SKILL.md").exists() assert (tmp_path / "skill" / "SKILL.md").exists()
assert summary["target_concept"].endswith("thermodynamics-and-entropy") assert summary["target_concept"].endswith("thermodynamics-and-entropy")
assert summary["mastered_concepts"] assert summary["mastered_concepts"]
assert summary["source_document_count"] >= 1
assert summary["source_fragment_count"] >= 1
def test_ocw_demo_accepts_directory_tree_sources(tmp_path: Path) -> None:
source_dir = tmp_path / "course"
source_dir.mkdir()
(source_dir / "unit1.md").write_text(
"# Course\n\n## Unit 1\n### Counting and Probability\n- Objective: Explain counting arguments.\nBody text.",
encoding="utf-8",
)
(source_dir / "unit2.txt").write_text(
"## Unit 2\n### Shannon Entropy\nObjective: Relate uncertainty and entropy.\nExercise: Compare two distributions.",
encoding="utf-8",
)
sources = tmp_path / "sources.yaml"
sources.write_text("sources: []\n", encoding="utf-8")
summary = run_ocw_information_entropy_demo(
course_source=source_dir,
source_inventory=sources,
pack_dir=tmp_path / "pack",
run_dir=tmp_path / "run",
skill_dir=tmp_path / "skill",
)
corpus = json.loads((tmp_path / "pack" / "source_corpus.json").read_text(encoding="utf-8"))
assert summary["source_document_count"] == 2
assert len(corpus["sources"]) == 2
assert any(fragment["lesson_title"] == "Shannon Entropy" for fragment in corpus["fragments"])

View File

@ -0,0 +1,39 @@
from pathlib import Path
from didactopus.ocw_rolemesh_transcript_demo import _looks_truncated, run_ocw_rolemesh_transcript_demo
def test_looks_truncated_detects_prose_cutoff_before_numbered_list() -> None:
text = (
"Suppose we have a binary symmetric channel with crossover\n\n"
"1. Estimate the error probability.\n"
"2. Relate it to capacity."
)
assert _looks_truncated(text) is True
def test_looks_truncated_detects_common_cutoff_phrase() -> None:
assert _looks_truncated("Furthermore") is True
assert _looks_truncated("Compare the entropy of one roll with the") is True
def test_ocw_rolemesh_transcript_demo_writes_artifacts(tmp_path: Path) -> None:
root = Path(__file__).resolve().parents[1]
payload = run_ocw_rolemesh_transcript_demo(
root / "configs" / "config.example.yaml",
root / "skills" / "ocw-information-entropy-agent",
tmp_path,
)
assert payload["provider"] == "stub"
assert len(payload["transcript"]) >= 16
assert len(payload["curriculum_path_titles"]) >= 8
assert payload["role_fallbacks"] == {}
assert payload["status_updates"] == []
assert any(turn["speaker"] == "Didactopus Evaluator" for turn in payload["transcript"])
assert any("channel" in turn["content"].lower() for turn in payload["transcript"])
assert any("thermodynamic" in turn["content"].lower() for turn in payload["transcript"])
assert all(not _looks_truncated(turn["content"]) for turn in payload["transcript"])
assert (tmp_path / "rolemesh_transcript.json").exists()
assert (tmp_path / "rolemesh_transcript.md").exists()
assert "Pending Status Examples" not in (tmp_path / "rolemesh_transcript.md").read_text(encoding="utf-8")

View File

@ -0,0 +1,31 @@
from pathlib import Path
import json
from didactopus.course_ingest import parse_markdown_course
from didactopus.pack_emitter import build_source_corpus, write_source_corpus
SAMPLE = """
# Sample Course
## Module 1
### Lesson A
- Objective: Explain Topic A.
- Exercise: Solve Task A.
Topic A body.
"""
def test_build_source_corpus_preserves_lesson_text_and_signals(tmp_path: Path) -> None:
course = parse_markdown_course(SAMPLE, "Sample Course")
corpus = build_source_corpus(course)
assert corpus["course_title"] == "Sample Course"
assert corpus["sources"]
assert any(fragment["kind"] == "lesson_body" and "Topic A body." in fragment["text"] for fragment in corpus["fragments"])
assert any(fragment["kind"] == "objective" and "Explain Topic A." in fragment["text"] for fragment in corpus["fragments"])
assert any(fragment["kind"] == "exercise" and "Solve Task A." in fragment["text"] for fragment in corpus["fragments"])
write_source_corpus(course, tmp_path)
written = json.loads((tmp_path / "source_corpus.json").read_text(encoding="utf-8"))
assert written["fragments"]
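The corpus contract the assertions above rely on, one fragment per lesson body, objective, and exercise, each tagged with a `kind` and `lesson_title`, can be pictured with a minimal builder. This is a hypothetical sketch of the shape only; `build_source_corpus` in `didactopus.pack_emitter` produces the real structure.

```python
def build_fragments(lessons: list[dict]) -> list[dict]:
    """Flatten parsed lessons into tagged text fragments."""
    fragments = []
    for lesson in lessons:
        fragments.append(
            {"kind": "lesson_body", "lesson_title": lesson["title"], "text": lesson["body"]}
        )
        for objective in lesson.get("objectives", []):
            fragments.append(
                {"kind": "objective", "lesson_title": lesson["title"], "text": objective}
            )
        for exercise in lesson.get("exercises", []):
            fragments.append(
                {"kind": "exercise", "lesson_title": lesson["title"], "text": exercise}
            )
    return fragments
```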

View File

@ -0,0 +1,15 @@
from pathlib import Path
from didactopus.rolemesh_demo import run_rolemesh_demo
def test_run_rolemesh_demo_writes_output(tmp_path: Path) -> None:
root = Path(__file__).resolve().parents[1]
payload = run_rolemesh_demo(
root / "configs" / "config.example.yaml",
tmp_path / "rolemesh_demo.json",
)
assert (tmp_path / "rolemesh_demo.json").exists()
assert payload["provider"] == "stub"
assert "mentor" in payload["mentor_prompt"]

View File

@ -32,3 +32,31 @@ def test_document_to_course_skips_empty_sections(tmp_path: Path) -> None:
doc = adapt_document(a)
course = document_to_course(doc, "Topic")
assert [lesson.title for lesson in course.modules[0].lessons] == ["Filled"]
def test_document_to_course_parses_bulleted_objectives_and_exercises(tmp_path: Path) -> None:
a = tmp_path / "a.md"
a.write_text(
"# T\n\n## M\n### Shannon Entropy\n- Objective: Explain uncertainty.\n- Exercise: Compute entropy.\nBody.",
encoding="utf-8",
)
doc = adapt_document(a)
course = document_to_course(doc, "Topic")
lesson = course.modules[0].lessons[0]
assert lesson.objectives == ["Explain uncertainty."]
assert lesson.exercises == ["Compute entropy."]
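The bullet convention this test exercises, `- Objective: ...` and `- Exercise: ...` lines peeled off into lesson metadata while other lines stay in the body, can be sketched as a one-pass parser. A hypothetical illustration; the real parsing lives in `didactopus.course_ingest`.

```python
import re

BULLET = re.compile(r"^-\s*(Objective|Exercise):\s*(.+)$")


def split_lesson_bullets(lines: list[str]) -> tuple[list[str], list[str], list[str]]:
    """Separate objective/exercise bullets from the lesson body."""
    objectives: list[str] = []
    exercises: list[str] = []
    body: list[str] = []
    for line in lines:
        match = BULLET.match(line.strip())
        if match and match.group(1) == "Objective":
            objectives.append(match.group(2).strip())
        elif match:
            exercises.append(match.group(2).strip())
        else:
            body.append(line)
    return objectives, exercises, body
```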
def test_extract_concepts_retains_lessons_but_filters_generic_terms(tmp_path: Path) -> None:
a = tmp_path / "a.md"
a.write_text(
"# T\n\n## M\n### MIT OCW 6.050J Information and Entropy: Syllabus\n- Objective: Explain the course.\nBody.\n\n### Channel Capacity\n- Objective: Explain noisy channels.\n- Exercise: State a capacity limit.\nChannel Capacity links reliable communication to noise and coding.",
encoding="utf-8",
)
doc = adapt_document(a)
course = document_to_course(doc, "Topic")
concepts = extract_concept_candidates(course)
titles = {concept.title for concept in concepts}
assert "MIT OCW 6.050J Information and Entropy: Syllabus" in titles
assert "Explain" not in titles
assert "Channel Capacity" in titles
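The filtering behaviour pinned down here, full lesson titles survive as concept candidates while lone generic verbs like "Explain" are dropped, can be sketched as a stop-word check on single-word titles. This is a hypothetical heuristic; `extract_concept_candidates` may use different criteria.

```python
# Bare instructional verbs that should not surface as standalone concepts.
GENERIC_TERMS = {"explain", "describe", "state", "compute", "solve", "relate"}


def filter_concept_titles(candidates: list[str]) -> list[str]:
    """Drop single-word generic terms; keep multi-word lesson titles."""
    kept = []
    for title in candidates:
        words = title.split()
        if len(words) == 1 and words[0].lower() in GENERIC_TERMS:
            continue
        kept.append(title)
    return kept
```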