Revised docs for quick start

This commit is contained in:
welsberr 2026-03-14 21:48:31 -04:00
parent 886b7ac418
commit 4c9b4f52dd
30 changed files with 2100 additions and 0 deletions

README.md

@ -12,6 +12,133 @@ At a high level, the repository does five things:
4. Build merged learning graphs, rank next concepts, accumulate learner evidence, and export capability profiles.
5. Demonstrate end-to-end flows, including an MIT OCW Information and Entropy demo that produces a pack, learner outputs, a reusable skill bundle, and progress visualizations.
## Start Here If You Just Want To Learn
If you only want the shortest path to "show me Didactopus helping someone learn," run:
```bash
pip install -e .
python -m didactopus.ocw_information_entropy_demo
python -m didactopus.ocw_progress_viz
python -m didactopus.ocw_skill_agent_demo
```
Then open:
- `examples/ocw-information-entropy-run/learner_progress.html`
- `examples/ocw-information-entropy-skill-demo/skill_demo.md`
- `skills/ocw-information-entropy-agent/`
That gives you:
- a generated topic pack
- a visible learning path
- progress artifacts
- a reusable skill grounded in the exported knowledge
If that is your use case, read the next section, `Fast Start For Impatient Autodidacts`, and skip the deeper architecture sections until you need them.
## Fast Start For Impatient Autodidacts
If your real question is "How quickly can I get this to help me learn something?", use one of these paths.
### Fastest path: use the included MIT OCW demo
This is the shortest route to seeing the whole system work as a personal mentor scaffold.
1. Install the repo:
```bash
pip install -e .
```
2. Generate the demo pack, learner outputs, and reusable skill:
```bash
python -m didactopus.ocw_information_entropy_demo
```
3. Render the learner progress views:
```bash
python -m didactopus.ocw_progress_viz
python -m didactopus.ocw_progress_viz --full-map
```
4. Run the "agent uses the learned skill" demo:
```bash
python -m didactopus.ocw_skill_agent_demo
```
After that, inspect:
- `examples/ocw-information-entropy-run/`
- `examples/ocw-information-entropy-skill-demo/`
- `skills/ocw-information-entropy-agent/`
What you get:
- a domain pack for the topic
- a guided curriculum path
- a synthetic learner run over that path
- a capability export
- a reusable skill bundle
- visual progress artifacts
This is the best "show me why this is fun" path in the current repo.
### Fast custom path: turn one markdown file into a draft learning domain
If you already have notes, a syllabus, or a course outline, the lightest custom workflow is:
1. Put the material in a Markdown or text file.
2. Adapt and ingest it through the course/topic pipeline.
3. Emit a draft pack.
4. Review only what matters.
The easiest reference for this flow is the OCW demo source:
- `examples/ocw-information-entropy/6-050j-information-and-entropy.md`
Use it as a template for your own topic, then follow the same pattern implemented in:
- `didactopus.ocw_information_entropy_demo`
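A sketch of what such a source file can look like, mirroring the lesson shape used by the OCW demo file (headings, one objective and one exercise per lesson); the topic and wording here are placeholders:

```markdown
# My Topic Title
Source: where the material came from.
## First Module
### First Concept
- Objective: Explain the first concept in one or two sentences.
- Exercise: Work one small problem that uses the first concept.
This lesson introduces First Concept and its key terms.
```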
### If you want a mentor more than a curation tool
Treat Didactopus as a loop:
1. Start from one topic you genuinely care about.
2. Generate a draft pack quickly, even if it is imperfect.
3. Keep only the concepts and progression that feel useful.
4. Use the resulting pack and skill outputs to drive explanations, study plans, and self-checks.
The important idea is not "perfect ingestion first." It is "usable learning structure fast enough that you keep going."
### Current friction honestly stated
The lowest-friction path is the included demo. The custom path still asks you to be comfortable with:
- running Python commands locally
- editing or preparing a source file
- accepting heuristic extraction noise
- reviewing draft outputs before trusting them
Didactopus is already good at reducing the activation energy from "pile of source material" to "coherent learning structure," but it is not yet a one-click end-user tutor product.
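To make "heuristic extraction noise" concrete, here is a minimal sketch of a title-case candidate heuristic. It is not the project's actual extractor, only an illustration of why generic words like "This" can show up as concept candidates and need review:

```python
import re

def extract_candidates(text: str) -> list[str]:
    """Naive heuristic: treat Title-Cased phrases as concept candidates."""
    # Matches runs of capitalized words, optionally joined by "and",
    # e.g. "Shannon Entropy" or "Counting and Probability".
    pattern = r"\b[A-Z][a-z]+(?:\s+(?:and\s+)?[A-Z][a-z]+)*"
    return re.findall(pattern, text)

text = "This lesson centers Shannon Entropy and compares it with Mutual Information."
print(extract_candidates(text))  # → ['This', 'Shannon Entropy', 'Mutual Information']
```

Note that the sentence-initial "This" is captured alongside the real concepts, which is exactly the kind of candidate the review flow exists to prune.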
### Why use it anyway?
Because it can make learning feel more like building a visible map of mastery than like passively consuming material.
Instead of only reading notes, you can get:
- a concept graph
- a staged path
- explicit prerequisites
- evidence-aware progress artifacts
- reusable skill outputs for future tutoring or evaluation
## What Is In This Repository
- `src/didactopus/`


@ -8,6 +8,43 @@ Didactopus turns educational material into structured learning packs, then uses
It is a workbench-style repository with runnable code, tests, example packs, generated outputs, and local-first review/demo flows.
## I am one person trying to learn a topic. What is the fastest useful way to use this?
Use the included MIT OCW Information and Entropy demo first.
Run:
```bash
pip install -e .
python -m didactopus.ocw_information_entropy_demo
python -m didactopus.ocw_progress_viz
python -m didactopus.ocw_skill_agent_demo
```
That gives you, with minimal setup:
- a generated topic pack
- a guided curriculum path
- a learner progress view
- a capability export
- a reusable skill bundle
- a demo of an agentic system using that skill
If you only want to see whether Didactopus feels useful as a personal mentor scaffold, this is the right place to start.
## What is the fastest custom route for a single learner?
Start from one Markdown or text file for a topic you care about.
The lightest custom pattern is:
1. Prepare a single source file with lesson headings, short descriptions, objectives, and exercises.
2. Use the OCW demo source in `examples/ocw-information-entropy/` as the model.
3. Adapt the same pipeline shape used by `didactopus.ocw_information_entropy_demo`.
4. Review the resulting draft pack just enough to remove obvious noise.
The current system is best when used as "generate a usable map quickly, then refine only what matters."
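As an illustration only (the path and content here are hypothetical, not a Didactopus API), step 1 can be as small as writing one file in the demo's shape:

```python
from pathlib import Path

# Illustrative only: write a minimal single-file topic source in the same
# shape as the OCW demo (headings, objective, exercise per lesson).
topic = Path("examples/my-topic/my-topic.md")
topic.parent.mkdir(parents=True, exist_ok=True)
topic.write_text(
    "# My Topic\n"
    "## Module One\n"
    "### First Concept\n"
    "- Objective: Explain the first concept in one sentence.\n"
    "- Exercise: Work one small problem using the first concept.\n"
    "This lesson introduces First Concept.\n"
)
print(topic.read_text().splitlines()[0])  # → # My Topic
```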
## What is a domain pack?
A domain pack is the unit Didactopus uses to represent a learning domain. In practice it is a directory containing:
@ -43,6 +80,19 @@ Yes, but conservatively. Those adapters currently normalize text in a simplified
No. The current agentic learner paths are deterministic and synthetic. They are meant to exercise the orchestration pattern, evaluator pipeline, mastery updates, capability export, and visualization flow without requiring an external model service. No. The current agentic learner paths are deterministic and synthetic. They are meant to exercise the orchestration pattern, evaluator pipeline, mastery updates, capability export, and visualization flow without requiring an external model service.
## Can I still use it as a personal mentor even though the learner is synthetic?
Yes, if you think of the current repo as a structured learning workbench rather than a chat product.
Right now the value is in:
- turning source material into a concept/path structure
- making prerequisites explicit
- exporting progress and capability artifacts
- generating reusable skill context for future tutoring or evaluation
The current demos show the shape of a mentor workflow even though the agent itself is not yet a live external model integration.
## What is the current evidence model?
The evidence engine supports:
@ -80,6 +130,7 @@ generates:
- a new pack in `domain-packs/mit-ocw-information-entropy/`
- learner outputs in `examples/ocw-information-entropy-run/`
- a repo-local skill bundle in `skills/ocw-information-entropy-agent/`
- an agentic skill-usage demo in `examples/ocw-information-entropy-skill-demo/`
## What visualizations exist today?
@ -95,6 +146,16 @@ python -m didactopus.ocw_progress_viz
python -m didactopus.ocw_progress_viz --full-map
```
## What should I expect to review manually?
Today, usually:
- noisy concept candidates from extraction
- weak or missing mastery signals
- any prerequisite ordering that feels too thin or too rigid
The fastest productive workflow is not to perfect everything. It is to prune obvious noise and keep moving.
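Pruning obvious noise can be partly mechanical. A minimal sketch (not a Didactopus API) that drops single generic words from a candidate list while keeping multi-word domain phrases:

```python
# Words that, alone, are almost never real concepts in this kind of
# extraction (a hypothetical stoplist, not the project's actual list).
GENERIC = {"this", "objective", "exercise", "explain", "derive", "compute"}

def prune(candidates: list[str]) -> list[str]:
    """Keep multi-word phrases; drop lone generic words."""
    kept = []
    for name in candidates:
        words = name.split()
        if len(words) == 1 and words[0].lower() in GENERIC:
            continue  # obvious extractor noise
        kept.append(name)
    return kept

candidates = ["Shannon Entropy", "This", "Explain", "Mutual Information"]
print(prune(candidates))  # → ['Shannon Entropy', 'Mutual Information']
```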
## Is the generated content free of extractor noise?
No. The current extractors can still emit noisy candidate concepts, especially from title-cased phrases embedded in lesson text. That is why review flags, workspace review, and promotion flows are first-class parts of the project.


@ -0,0 +1,61 @@
{
"learner_id": "ocw-information-entropy-agent",
"domain": "MIT OCW Information and Entropy",
"artifacts": [
{
"concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
"artifact_type": "symbolic",
"artifact_name": "mit-ocw-6-050j-information-and-entropy.md"
},
{
"concept": "mit-ocw-information-and-entropy::counting-and-probability",
"artifact_type": "symbolic",
"artifact_name": "counting-and-probability.md"
},
{
"concept": "mit-ocw-information-and-entropy::shannon-entropy",
"artifact_type": "symbolic",
"artifact_name": "shannon-entropy.md"
},
{
"concept": "mit-ocw-information-and-entropy::mutual-information",
"artifact_type": "symbolic",
"artifact_name": "mutual-information.md"
},
{
"concept": "mit-ocw-information-and-entropy::data-compression",
"artifact_type": "symbolic",
"artifact_name": "data-compression.md"
},
{
"concept": "mit-ocw-information-and-entropy::huffman-coding",
"artifact_type": "symbolic",
"artifact_name": "huffman-coding.md"
},
{
"concept": "mit-ocw-information-and-entropy::channel-capacity",
"artifact_type": "symbolic",
"artifact_name": "channel-capacity.md"
},
{
"concept": "mit-ocw-information-and-entropy::channel-coding",
"artifact_type": "symbolic",
"artifact_name": "channel-coding.md"
},
{
"concept": "mit-ocw-information-and-entropy::error-correcting-codes",
"artifact_type": "symbolic",
"artifact_name": "error-correcting-codes.md"
},
{
"concept": "mit-ocw-information-and-entropy::cryptography-and-information-hiding",
"artifact_type": "symbolic",
"artifact_name": "cryptography-and-information-hiding.md"
},
{
"concept": "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
"artifact_type": "symbolic",
"artifact_name": "thermodynamics-and-entropy.md"
}
]
}


@ -0,0 +1,145 @@
{
"learner_id": "ocw-information-entropy-agent",
"display_name": "OCW Information Entropy Agent",
"domain": "MIT OCW Information and Entropy",
"mastered_concepts": [
"mit-ocw-information-and-entropy::channel-capacity",
"mit-ocw-information-and-entropy::channel-coding",
"mit-ocw-information-and-entropy::counting-and-probability",
"mit-ocw-information-and-entropy::cryptography-and-information-hiding",
"mit-ocw-information-and-entropy::data-compression",
"mit-ocw-information-and-entropy::error-correcting-codes",
"mit-ocw-information-and-entropy::huffman-coding",
"mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
"mit-ocw-information-and-entropy::mutual-information",
"mit-ocw-information-and-entropy::shannon-entropy",
"mit-ocw-information-and-entropy::thermodynamics-and-entropy"
],
"weak_dimensions_by_concept": {
"mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy": [],
"mit-ocw-information-and-entropy::counting-and-probability": [],
"mit-ocw-information-and-entropy::shannon-entropy": [],
"mit-ocw-information-and-entropy::mutual-information": [],
"mit-ocw-information-and-entropy::data-compression": [],
"mit-ocw-information-and-entropy::huffman-coding": [],
"mit-ocw-information-and-entropy::channel-capacity": [],
"mit-ocw-information-and-entropy::channel-coding": [],
"mit-ocw-information-and-entropy::error-correcting-codes": [],
"mit-ocw-information-and-entropy::cryptography-and-information-hiding": [],
"mit-ocw-information-and-entropy::thermodynamics-and-entropy": []
},
"evaluator_summary_by_concept": {
"mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::counting-and-probability": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::shannon-entropy": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::mutual-information": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::data-compression": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::huffman-coding": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::channel-capacity": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::channel-coding": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::error-correcting-codes": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::cryptography-and-information-hiding": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::thermodynamics-and-entropy": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
}
},
"artifacts": [
{
"concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
"artifact_type": "symbolic",
"artifact_name": "mit-ocw-6-050j-information-and-entropy.md"
},
{
"concept": "mit-ocw-information-and-entropy::counting-and-probability",
"artifact_type": "symbolic",
"artifact_name": "counting-and-probability.md"
},
{
"concept": "mit-ocw-information-and-entropy::shannon-entropy",
"artifact_type": "symbolic",
"artifact_name": "shannon-entropy.md"
},
{
"concept": "mit-ocw-information-and-entropy::mutual-information",
"artifact_type": "symbolic",
"artifact_name": "mutual-information.md"
},
{
"concept": "mit-ocw-information-and-entropy::data-compression",
"artifact_type": "symbolic",
"artifact_name": "data-compression.md"
},
{
"concept": "mit-ocw-information-and-entropy::huffman-coding",
"artifact_type": "symbolic",
"artifact_name": "huffman-coding.md"
},
{
"concept": "mit-ocw-information-and-entropy::channel-capacity",
"artifact_type": "symbolic",
"artifact_name": "channel-capacity.md"
},
{
"concept": "mit-ocw-information-and-entropy::channel-coding",
"artifact_type": "symbolic",
"artifact_name": "channel-coding.md"
},
{
"concept": "mit-ocw-information-and-entropy::error-correcting-codes",
"artifact_type": "symbolic",
"artifact_name": "error-correcting-codes.md"
},
{
"concept": "mit-ocw-information-and-entropy::cryptography-and-information-hiding",
"artifact_type": "symbolic",
"artifact_name": "cryptography-and-information-hiding.md"
},
{
"concept": "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
"artifact_type": "symbolic",
"artifact_name": "thermodynamics-and-entropy.md"
}
]
}


@ -0,0 +1,97 @@
# Capability Profile: OCW Information Entropy Agent
- Learner ID: `ocw-information-entropy-agent`
- Domain: `MIT OCW Information and Entropy`
## Mastered Concepts
- mit-ocw-information-and-entropy::channel-capacity
- mit-ocw-information-and-entropy::channel-coding
- mit-ocw-information-and-entropy::counting-and-probability
- mit-ocw-information-and-entropy::cryptography-and-information-hiding
- mit-ocw-information-and-entropy::data-compression
- mit-ocw-information-and-entropy::error-correcting-codes
- mit-ocw-information-and-entropy::huffman-coding
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy
- mit-ocw-information-and-entropy::mutual-information
- mit-ocw-information-and-entropy::shannon-entropy
- mit-ocw-information-and-entropy::thermodynamics-and-entropy
## Concept Summaries
### mit-ocw-information-and-entropy::channel-capacity
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::channel-coding
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::counting-and-probability
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::cryptography-and-information-hiding
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::data-compression
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::error-correcting-codes
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::huffman-coding
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::mutual-information
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::shannon-entropy
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::thermodynamics-and-entropy
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
## Artifacts
- mit-ocw-6-050j-information-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy
- counting-and-probability.md (symbolic) for mit-ocw-information-and-entropy::counting-and-probability
- shannon-entropy.md (symbolic) for mit-ocw-information-and-entropy::shannon-entropy
- mutual-information.md (symbolic) for mit-ocw-information-and-entropy::mutual-information
- data-compression.md (symbolic) for mit-ocw-information-and-entropy::data-compression
- huffman-coding.md (symbolic) for mit-ocw-information-and-entropy::huffman-coding
- channel-capacity.md (symbolic) for mit-ocw-information-and-entropy::channel-capacity
- channel-coding.md (symbolic) for mit-ocw-information-and-entropy::channel-coding
- error-correcting-codes.md (symbolic) for mit-ocw-information-and-entropy::error-correcting-codes
- cryptography-and-information-hiding.md (symbolic) for mit-ocw-information-and-entropy::cryptography-and-information-hiding
- thermodynamics-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::thermodynamics-and-entropy

(Four file diffs suppressed because one or more lines are too long; two image files added, 6.9 KiB and 13 KiB.)


@ -0,0 +1,93 @@
{
"course_source": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/6-050j-information-and-entropy.md",
"pack_dir": "/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy",
"skill_dir": "/home/netuser/dev/Didactopustry1/skills/ocw-information-entropy-agent",
"review_flags": [
"Module 'Imported from MARKDOWN' has no explicit exercises; mastery signals may be weak.",
"Concept 'MIT OCW 6.050J Information and Entropy' has no extracted mastery signals; review manually.",
"Concept 'Information' has no extracted mastery signals; review manually.",
"Concept 'Entropy' has no extracted mastery signals; review manually.",
"Concept 'Source' has no extracted mastery signals; review manually.",
"Concept 'OpenCourseWare' has no extracted mastery signals; review manually.",
"Concept 'Spring' has no extracted mastery signals; review manually.",
"Concept 'Attribution' has no extracted mastery signals; review manually.",
"Concept 'Counting and Probability' has no extracted mastery signals; review manually.",
"Concept 'Counting' has no extracted mastery signals; review manually.",
"Concept 'Probability' has no extracted mastery signals; review manually.",
"Concept 'Objective' has no extracted mastery signals; review manually.",
"Concept 'Explain' has no extracted mastery signals; review manually.",
"Concept 'Exercise' has no extracted mastery signals; review manually.",
"Concept 'Derive' has no extracted mastery signals; review manually.",
"Concept 'This' has no extracted mastery signals; review manually.",
"Concept 'Random' has no extracted mastery signals; review manually.",
"Concept 'Shannon Entropy' has no extracted mastery signals; review manually.",
"Concept 'Shannon' has no extracted mastery signals; review manually.",
"Concept 'Compute' has no extracted mastery signals; review manually.",
"Concept 'Bernoulli' has no extracted mastery signals; review manually.",
"Concept 'Mutual Information' has no extracted mastery signals; review manually.",
"Concept 'Mutual' has no extracted mastery signals; review manually.",
"Concept 'Compare' has no extracted mastery signals; review manually.",
"Concept 'Dependence' has no extracted mastery signals; review manually.",
"Concept 'Data Compression' has no extracted mastery signals; review manually.",
"Concept 'Data' has no extracted mastery signals; review manually.",
"Concept 'Compression' has no extracted mastery signals; review manually.",
"Concept 'Describe' has no extracted mastery signals; review manually.",
"Concept 'Redundancy' has no extracted mastery signals; review manually.",
"Concept 'Huffman Coding' has no extracted mastery signals; review manually.",
"Concept 'Huffman' has no extracted mastery signals; review manually.",
"Concept 'Coding' has no extracted mastery signals; review manually.",
"Concept 'Build' has no extracted mastery signals; review manually.",
"Concept 'Prefix' has no extracted mastery signals; review manually.",
"Concept 'Channel Capacity' has no extracted mastery signals; review manually.",
"Concept 'Channel' has no extracted mastery signals; review manually.",
"Concept 'Capacity' has no extracted mastery signals; review manually.",
"Concept 'State' has no extracted mastery signals; review manually.",
"Concept 'Reliable' has no extracted mastery signals; review manually.",
"Concept 'Channel Coding' has no extracted mastery signals; review manually.",
"Concept 'Contrast' has no extracted mastery signals; review manually.",
"Concept 'Decoding' has no extracted mastery signals; review manually.",
"Concept 'Error Correcting Codes' has no extracted mastery signals; review manually.",
"Concept 'Error' has no extracted mastery signals; review manually.",
"Concept 'Correcting' has no extracted mastery signals; review manually.",
"Concept 'Codes' has no extracted mastery signals; review manually.",
"Concept 'Cryptography and Information Hiding' has no extracted mastery signals; review manually.",
"Concept 'Cryptography' has no extracted mastery signals; review manually.",
"Concept 'Hiding' has no extracted mastery signals; review manually.",
"Concept 'Thermodynamics and Entropy' has no extracted mastery signals; review manually.",
"Concept 'Thermodynamics' has no extracted mastery signals; review manually.",
"Concept 'Course Synthesis' has no extracted mastery signals; review manually.",
"Concept 'Course' has no extracted mastery signals; review manually.",
"Concept 'Synthesis' has no extracted mastery signals; review manually.",
"Concept 'Synthesize' has no extracted mastery signals; review manually.",
"Concept 'Produce' has no extracted mastery signals; review manually."
],
"concept_count": 56,
"target_concept": "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
"curriculum_path": [
"mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
"mit-ocw-information-and-entropy::counting-and-probability",
"mit-ocw-information-and-entropy::shannon-entropy",
"mit-ocw-information-and-entropy::mutual-information",
"mit-ocw-information-and-entropy::data-compression",
"mit-ocw-information-and-entropy::huffman-coding",
"mit-ocw-information-and-entropy::channel-capacity",
"mit-ocw-information-and-entropy::channel-coding",
"mit-ocw-information-and-entropy::error-correcting-codes",
"mit-ocw-information-and-entropy::cryptography-and-information-hiding",
"mit-ocw-information-and-entropy::thermodynamics-and-entropy"
],
"mastered_concepts": [
"mit-ocw-information-and-entropy::channel-capacity",
"mit-ocw-information-and-entropy::channel-coding",
"mit-ocw-information-and-entropy::counting-and-probability",
"mit-ocw-information-and-entropy::cryptography-and-information-hiding",
"mit-ocw-information-and-entropy::data-compression",
"mit-ocw-information-and-entropy::error-correcting-codes",
"mit-ocw-information-and-entropy::huffman-coding",
"mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
"mit-ocw-information-and-entropy::mutual-information",
"mit-ocw-information-and-entropy::shannon-entropy",
"mit-ocw-information-and-entropy::thermodynamics-and-entropy"
],
"artifact_count": 11
}


@ -0,0 +1,97 @@
{
"skill": {
"name": "ocw-information-entropy-agent",
"description": "Use the generated MIT OCW Information and Entropy pack, concept ordering, and learner artifacts to mentor or evaluate information-theory work."
},
"study_plan": {
"skill": "ocw-information-entropy-agent",
"task": "Help a learner connect Shannon entropy, channel capacity, and thermodynamic entropy.",
"steps": [
{
"concept_key": "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
"title": "Thermodynamics and Entropy",
"status": "mastered",
"prerequisites": [
"mit-ocw-information-and-entropy::cryptography-and-information-hiding"
],
"recommended_action": "Use Thermodynamics and Entropy as the primary teaching anchor."
},
{
"concept_key": "mit-ocw-information-and-entropy::course-synthesis",
"title": "Course Synthesis",
"status": "review-needed",
"prerequisites": [
"mit-ocw-information-and-entropy::thermodynamics-and-entropy"
],
"recommended_action": "Review prerequisites before teaching Course Synthesis."
},
{
"concept_key": "mit-ocw-information-and-entropy::shannon-entropy",
"title": "Shannon Entropy",
"status": "mastered",
"prerequisites": [
"mit-ocw-information-and-entropy::counting-and-probability"
],
"recommended_action": "Use Shannon Entropy as the primary teaching anchor."
}
],
"guided_path_reference": [
"mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
"mit-ocw-information-and-entropy::counting-and-probability",
"mit-ocw-information-and-entropy::shannon-entropy",
"mit-ocw-information-and-entropy::mutual-information",
"mit-ocw-information-and-entropy::data-compression",
"mit-ocw-information-and-entropy::huffman-coding",
"mit-ocw-information-and-entropy::channel-capacity",
"mit-ocw-information-and-entropy::channel-coding",
"mit-ocw-information-and-entropy::error-correcting-codes",
"mit-ocw-information-and-entropy::cryptography-and-information-hiding",
"mit-ocw-information-and-entropy::thermodynamics-and-entropy"
]
},
"explanation": {
"concept_key": "mit-ocw-information-and-entropy::channel-capacity",
"title": "Channel Capacity",
"explanation": "Channel Capacity is represented in the Information and Entropy skill as part of a progression from foundational probability ideas toward communication limits and physical interpretation. It depends on huffman-coding. The current demo learner already mastered this concept, with evaluator means {'correctness': 0.8400000000000001, 'explanation': 0.85, 'critique': 0.7999999999999999}, so the skill can use it as a stable explanation anchor.",
"source_description": "- Objective: Explain Channel Capacity as a limit on reliable communication over noisy channels.\n- Exercise: State why reliable transmission above capacity is impossible in the long run.\nThis lesson develops Channel Capacity, Reliable Commun"
},
"evaluation": {
"concept_key": "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
"submission": "Therefore entropy = uncertainty in a message model, but one limitation is that thermodynamic entropy and Shannon entropy are not identical without careful interpretation.",
"verdict": "acceptable",
"aggregated": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.6499999999999999
},
"evaluators": [
{
"name": "rubric",
"dimensions": {
"correctness": 0.8,
"explanation": 0.85
},
"notes": "Heuristic scaffold rubric score."
},
{
"name": "symbolic_rule",
"dimensions": {
"correctness": 0.88
},
"notes": "Stub symbolic evaluator."
},
{
"name": "critique",
"dimensions": {
"critique": 0.6499999999999999
},
"notes": "Stub critique evaluator."
}
],
"skill_reference": {
"skill_name": "ocw-information-entropy-agent",
"mastered_by_demo_agent": true
},
"follow_up": "Extend the answer with an explicit limitation or assumption."
}
}


@ -0,0 +1,17 @@
# OCW Information and Entropy Skill Demo
- Skill: `ocw-information-entropy-agent`
- Description: Use the generated MIT OCW Information and Entropy pack, concept ordering, and learner artifacts to mentor or evaluate information-theory work.
## Study Plan
- Thermodynamics and Entropy (mastered): Use Thermodynamics and Entropy as the primary teaching anchor.
- Course Synthesis (review-needed): Review prerequisites before teaching Course Synthesis.
- Shannon Entropy (mastered): Use Shannon Entropy as the primary teaching anchor.
## Explanation Demo
Channel Capacity is represented in the Information and Entropy skill as part of a progression from foundational probability ideas toward communication limits and physical interpretation. It depends on huffman-coding. The current demo learner already mastered this concept, with evaluator means {'correctness': 0.8400000000000001, 'explanation': 0.85, 'critique': 0.7999999999999999}, so the skill can use it as a stable explanation anchor.
## Evaluation Demo
- Verdict: acceptable
- Aggregated dimensions: {'correctness': 0.8400000000000001, 'explanation': 0.85, 'critique': 0.6499999999999999}
- Follow-up: Extend the answer with an explicit limitation or assumption.


@ -0,0 +1,67 @@
# MIT OCW 6.050J Information and Entropy
Source: MIT OpenCourseWare 6.050J Information and Entropy, Spring 2008.
Attribution: adapted from the OCW course overview, unit sequence, and assigned textbook references.
## Foundations of Information Theory
### Counting and Probability
- Objective: Explain how counting arguments, probability spaces, and random variables support later information-theory results.
- Exercise: Derive a simple counting argument for binary strings and compute an event probability.
This lesson introduces Counting, Probability, Random Variables, and Combinatorics as the shared language for the rest of the course. The learner should connect these basics to uncertainty, messages, and evidence.
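A minimal worked version of this exercise (illustrative, not part of the OCW material):

```python
from math import comb

# Counting argument: binary strings of length n with exactly k ones.
n, k = 8, 3
count = comb(n, k)            # C(8, 3) = 56 such strings
p_uniform = count / 2 ** n    # probability if all 2^n strings are equally likely
print(count, p_uniform)       # → 56 0.21875
```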
### Shannon Entropy
- Objective: Explain Shannon Entropy as a measure of uncertainty and compare high-entropy and low-entropy sources.
- Exercise: Compute the entropy of a Bernoulli source and interpret the result.
This lesson centers Shannon Entropy, Surprise, and Source Models. The learner should describe why entropy matters: it bounds efficient description length and clarifies uncertainty.
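The Bernoulli exercise above can be sketched in a few lines of Python. This is a minimal illustration for the learner, not part of the generated pack:

```python
import math

def bernoulli_entropy(p: float) -> float:
    """Shannon entropy, in bits, of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0  # a deterministic source carries no uncertainty
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# A fair coin maximizes uncertainty; a heavily biased coin is more predictable.
fair = bernoulli_entropy(0.5)    # 1.0 bit
biased = bernoulli_entropy(0.9)  # roughly 0.47 bits
```

Interpreting the result: the biased source needs fewer than one bit per symbol on average, which is exactly the compression angle developed later in the course.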
### Mutual Information
- Objective: Explain Mutual Information and relate it to dependence between signals.
- Exercise: Compare independent variables with dependent variables using mutual-information reasoning.
This lesson introduces Mutual Information, Dependence, and Observations. The learner should explain how information gain changes when observations reduce uncertainty.
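The independent-versus-dependent comparison in the exercise can be computed directly from a joint distribution. A small sketch (the dictionaries below are illustrative toy distributions, not course data):

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution given as {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    mi = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            mi += p * math.log2(p / (px[x] * py[y]))
    return mi

# Independent fair bits: observing X says nothing about Y.
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
# Perfectly correlated bits: observing X removes all uncertainty about Y.
correlated = {(0, 0): 0.5, (1, 1): 0.5}
```

The independent case gives 0 bits and the perfectly correlated case gives 1 bit, which matches the "information gain when observations reduce uncertainty" framing above.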
## Compression and Source Coding
### Data Compression
- Objective: Explain lossless compression in terms of entropy and typical structure.
- Exercise: Describe when compression succeeds and when it fails on already-random data.
This lesson covers Data Compression, Redundancy, and Efficient Representation. The learner should connect entropy limits to coding choices.
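The failure mode in the exercise, that already-random data does not compress, can be demonstrated directly. A sketch using the standard-library `zlib` compressor; the exact byte counts depend on the compression level, so only rough bounds are claimed:

```python
import random
import zlib

random.seed(0)
random_bytes = bytes(random.getrandbits(8) for _ in range(10_000))
structured = b"abab" * 2_500  # same length, but highly redundant

# Redundant input shrinks dramatically; random input does not.
small = len(zlib.compress(structured))
large = len(zlib.compress(random_bytes))
```

The redundant input collapses to a few dozen bytes, while the random input stays at roughly its original size: entropy sets the limit, and random data is already near it.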
### Huffman Coding
- Objective: Explain Huffman Coding and justify why shorter codewords should track more likely symbols.
- Exercise: Build a Huffman code for a small source alphabet.
This lesson focuses on Huffman Coding, Prefix Codes, and Expected Length. The learner should explain the tradeoff between probability, tree structure, and average code length.
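The build exercise can be sketched with a heap-based Huffman construction. A minimal version for a four-symbol alphabet (the probabilities below are an illustrative dyadic source, chosen so the expected code length exactly equals the entropy):

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Build a Huffman code: {symbol: probability} -> {symbol: bitstring}."""
    tiebreak = count()  # keeps heap comparisons away from unorderable subtrees
    heap = [(p, next(tiebreak), sym) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, left = heapq.heappop(heap)   # merge the two least likely nodes
        p2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, next(tiebreak), (left, right)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix or "0"  # single-symbol edge case
    walk(heap[0][2], "")
    return codes

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
```

Because the code is built by always merging the two least likely nodes, more probable symbols end up nearer the root and receive shorter codewords, and the prefix property falls out of the tree structure.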
### Channel Capacity
- Objective: Explain Channel Capacity as a limit on reliable communication over noisy channels.
- Exercise: State why reliable transmission above capacity is impossible in the long run.
This lesson develops Channel Capacity, Reliable Communication, and Noise. The learner should explain why capacity matters: it defines the ceiling for dependable transmission.
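For the binary symmetric channel, the capacity limit in the objective has a closed form, C = 1 - H(p), where H is the binary entropy function. A small sketch:

```python
import math

def h2(p: float) -> float:
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def bsc_capacity(flip_prob: float) -> float:
    """Capacity of a binary symmetric channel: C = 1 - H(p)."""
    return 1.0 - h2(flip_prob)

noiseless = bsc_capacity(0.0)  # 1 bit per channel use
useless = bsc_capacity(0.5)    # 0: the output is independent of the input
```

The endpoints make the ceiling intuition concrete: a noiseless channel carries one full bit per use, while a channel that flips half its bits carries nothing reliably.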
## Communication Under Noise
### Channel Coding
- Objective: Explain how Channel Coding adds structure that protects messages against noise.
- Exercise: Contrast uncoded transmission with coded transmission on a noisy channel.
This lesson connects Channel Coding, Decoding, and Reliability. The learner should explain the role of redundancy and inference in successful communication.
### Error Correcting Codes
- Objective: Explain how Error Correcting Codes detect or correct symbol corruption.
- Exercise: Describe a simple parity-style code and its limits.
This lesson covers Error Correcting Codes, Parity, and Syndrome-style reasoning. The learner should discuss strengths, failure modes, and decoding assumptions.
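The parity-style exercise can be sketched with a single even-parity check bit, which detects any single-bit error but is fooled by an even number of flips:

```python
def add_parity(bits):
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(codeword):
    """True when the codeword passes the (detect-only) parity check."""
    return sum(codeword) % 2 == 0

word = add_parity([1, 0, 1, 1])
flipped = word.copy()
flipped[2] ^= 1  # a single-bit error is detected...
double = word.copy()
double[0] ^= 1
double[1] ^= 1   # ...but two errors cancel and slip through undetected
```

This makes the "limits" part of the exercise concrete: one check bit can detect but never locate an error, and it misses any even-weight error pattern entirely.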
## Applications and Synthesis
### Cryptography and Information Hiding
- Objective: Explain the relationship between secrecy, information leakage, and coded communication.
- Exercise: Compare a secure scheme with a weak one in terms of revealed information.
This lesson combines Cryptography, Information Leakage, and Adversarial Observation. The learner should explain secrecy as controlled information flow rather than only obscurity.
### Thermodynamics and Entropy
- Objective: Explain how thermodynamic entropy relates to, and differs from, Shannon entropy.
- Exercise: Compare the two entropy notions and identify what is preserved across the analogy.
This lesson connects Thermodynamics, Entropy, and Physical Interpretation. The learner should explain the analogy carefully because the shared mathematics does not erase domain differences.
### Course Synthesis
- Objective: Synthesize the course by connecting entropy, coding, reliability, and physical interpretation in one coherent narrative.
- Exercise: Produce a final study guide that links source coding, channel coding, secrecy, and thermodynamic analogies.
This lesson integrates Source Coding, Channel Coding, Cryptography, and Thermodynamic Analogy. The learner should show how the course forms one unified model of uncertainty, representation, and communication.

@ -0,0 +1,23 @@
---
name: ocw-information-entropy-agent
description: Use the generated MIT OCW Information and Entropy pack, concept ordering, and learner artifacts to mentor or evaluate information-theory work.
---
# OCW Information Entropy Agent
Use this skill when the task is about tutoring, evaluating, or planning study in Information Theory using the generated MIT OCW 6.050J pack.
## Workflow
1. Read `references/generated-course-summary.md` for the pack structure and target concepts.
2. Read `references/generated-capability-summary.md` to understand what the demo AI learner already mastered.
3. Use `assets/generated/pack/` as the source of truth for concept ids, prerequisites, and mastery signals.
4. When giving guidance, preserve the pack ordering from fundamentals through coding and thermodynamics.
5. When uncertain, say which concept or prerequisite in the generated pack is underspecified.
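Step 4's "preserve the pack ordering" amounts to a topological sort of the prerequisite edges. A minimal sketch, where the edge data is a hand-copied excerpt of the generated concepts YAML and `graphlib` requires Python 3.9+:

```python
from graphlib import TopologicalSorter

# Excerpt of the generated pack's prerequisite edges
# (concept id -> list of prerequisite ids).
prereqs = {
    "counting-and-probability": ["mit-ocw-6-050j-information-and-entropy"],
    "shannon-entropy": ["counting-and-probability"],
    "mutual-information": ["shannon-entropy"],
    "data-compression": ["mutual-information"],
    "huffman-coding": ["data-compression"],
    "channel-capacity": ["huffman-coding"],
}

# static_order yields prerequisites before the concepts that need them.
order = list(TopologicalSorter(prereqs).static_order())
```

Any study plan emitted by the skill should respect this ordering, teaching a concept only after every id in its prerequisite list.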
## Outputs
- study plans grounded in the pack prerequisites
- concept explanations tied to entropy, coding, and channel capacity
- evaluation checklists using the generated capability report
- follow-up exercises that extend the existing learner artifacts

@ -0,0 +1,3 @@
display_name: OCW Information Entropy Agent
short_description: Tutor and assess with the generated MIT OCW information-theory pack.
default_prompt: Help me use the MIT OCW information-and-entropy pack to study or evaluate work.

@ -0,0 +1,420 @@
concepts:
- id: mit-ocw-6-050j-information-and-entropy
title: MIT OCW 6.050J Information and Entropy
description: 'Source: MIT OpenCourseWare 6.050J Information and Entropy, Spring
2008.
Attribution: adapted from the OCW course overview, unit sequence, and assigned
textbook references.'
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: information
title: Information
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information
and Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: entropy
title: Entropy
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information
and Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: source
title: Source
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information
and Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: opencourseware
title: OpenCourseWare
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information
and Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: spring
title: Spring
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information
and Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: attribution
title: Attribution
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information
and Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: counting-and-probability
title: Counting and Probability
description: '- Objective: Explain how counting arguments, probability spaces, and
random variables support later information-theory results.
- Exercise: Derive a simple counting argument for binary strings and compute an
event probability.
This lesson i'
prerequisites:
- mit-ocw-6-050j-information-and-entropy
mastery_signals: []
mastery_profile: {}
- id: counting
title: Counting
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: probability
title: Probability
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: objective
title: Objective
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: explain
title: Explain
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: exercise
title: Exercise
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: derive
title: Derive
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: this
title: This
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: random
title: Random
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: shannon-entropy
title: Shannon Entropy
description: '- Objective: Explain Shannon Entropy as a measure of uncertainty and
compare high-entropy and low-entropy sources.
- Exercise: Compute the entropy of a Bernoulli source and interpret the result.
This lesson centers Shannon Entropy, Surprise'
prerequisites:
- counting-and-probability
mastery_signals: []
mastery_profile: {}
- id: shannon
title: Shannon
description: Candidate concept extracted from lesson 'Shannon Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: compute
title: Compute
description: Candidate concept extracted from lesson 'Shannon Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: bernoulli
title: Bernoulli
description: Candidate concept extracted from lesson 'Shannon Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: mutual-information
title: Mutual Information
description: '- Objective: Explain Mutual Information and relate it to dependence
between signals.
- Exercise: Compare independent variables with dependent variables using mutual-information
reasoning.
This lesson introduces Mutual Information, Dependenc'
prerequisites:
- shannon-entropy
mastery_signals: []
mastery_profile: {}
- id: mutual
title: Mutual
description: Candidate concept extracted from lesson 'Mutual Information'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: compare
title: Compare
description: Candidate concept extracted from lesson 'Mutual Information'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: dependence
title: Dependence
description: Candidate concept extracted from lesson 'Mutual Information'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: data-compression
title: Data Compression
description: '- Objective: Explain lossless compression in terms of entropy and
typical structure.
- Exercise: Describe when compression succeeds and when it fails on already-random
data.
This lesson covers Data Compression, Redundancy, and Efficient Rep'
prerequisites:
- mutual-information
mastery_signals: []
mastery_profile: {}
- id: data
title: Data
description: Candidate concept extracted from lesson 'Data Compression'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: compression
title: Compression
description: Candidate concept extracted from lesson 'Data Compression'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: describe
title: Describe
description: Candidate concept extracted from lesson 'Data Compression'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: redundancy
title: Redundancy
description: Candidate concept extracted from lesson 'Data Compression'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: huffman-coding
title: Huffman Coding
description: '- Objective: Explain Huffman Coding and justify why shorter codewords
should track more likely symbols.
- Exercise: Build a Huffman code for a small source alphabet.
This lesson focuses on Huffman Coding, Prefix Codes, and Expected Length.'
prerequisites:
- data-compression
mastery_signals: []
mastery_profile: {}
- id: huffman
title: Huffman
description: Candidate concept extracted from lesson 'Huffman Coding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: coding
title: Coding
description: Candidate concept extracted from lesson 'Huffman Coding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: build
title: Build
description: Candidate concept extracted from lesson 'Huffman Coding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: prefix
title: Prefix
description: Candidate concept extracted from lesson 'Huffman Coding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: channel-capacity
title: Channel Capacity
description: '- Objective: Explain Channel Capacity as a limit on reliable communication
over noisy channels.
- Exercise: State why reliable transmission above capacity is impossible in the
long run.
This lesson develops Channel Capacity, Reliable Commun'
prerequisites:
- huffman-coding
mastery_signals: []
mastery_profile: {}
- id: channel
title: Channel
description: Candidate concept extracted from lesson 'Channel Capacity'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: capacity
title: Capacity
description: Candidate concept extracted from lesson 'Channel Capacity'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: state
title: State
description: Candidate concept extracted from lesson 'Channel Capacity'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: reliable
title: Reliable
description: Candidate concept extracted from lesson 'Channel Capacity'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: channel-coding
title: Channel Coding
description: '- Objective: Explain how Channel Coding adds structure that protects
messages against noise.
- Exercise: Contrast uncoded transmission with coded transmission on a noisy channel.
This lesson connects Channel Coding, Decoding, and Reliabilit'
prerequisites:
- channel-capacity
mastery_signals: []
mastery_profile: {}
- id: contrast
title: Contrast
description: Candidate concept extracted from lesson 'Channel Coding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: decoding
title: Decoding
description: Candidate concept extracted from lesson 'Channel Coding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: error-correcting-codes
title: Error Correcting Codes
description: '- Objective: Explain how Error Correcting Codes detect or correct
symbol corruption.
- Exercise: Describe a simple parity-style code and its limits.
This lesson covers Error Correcting Codes, Parity, and Syndrome-style reasoning.
The learne'
prerequisites:
- channel-coding
mastery_signals: []
mastery_profile: {}
- id: error
title: Error
description: Candidate concept extracted from lesson 'Error Correcting Codes'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: correcting
title: Correcting
description: Candidate concept extracted from lesson 'Error Correcting Codes'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: codes
title: Codes
description: Candidate concept extracted from lesson 'Error Correcting Codes'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: cryptography-and-information-hiding
title: Cryptography and Information Hiding
description: '- Objective: Explain the relationship between secrecy, information
leakage, and coded communication.
- Exercise: Compare a secure scheme with a weak one in terms of revealed information.
This lesson combines Cryptography, Information Leakag'
prerequisites:
- error-correcting-codes
mastery_signals: []
mastery_profile: {}
- id: cryptography
title: Cryptography
description: Candidate concept extracted from lesson 'Cryptography and Information
Hiding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: hiding
title: Hiding
description: Candidate concept extracted from lesson 'Cryptography and Information
Hiding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: thermodynamics-and-entropy
title: Thermodynamics and Entropy
description: '- Objective: Explain how thermodynamic entropy relates to, and differs
from, Shannon entropy.
- Exercise: Compare the two entropy notions and identify what is preserved across
the analogy.
This lesson connects Thermodynamics, Entropy, and P'
prerequisites:
- cryptography-and-information-hiding
mastery_signals: []
mastery_profile: {}
- id: thermodynamics
title: Thermodynamics
description: Candidate concept extracted from lesson 'Thermodynamics and Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: course-synthesis
title: Course Synthesis
description: '- Objective: Synthesize the course by connecting entropy, coding,
reliability, and physical interpretation in one coherent narrative.
- Exercise: Produce a final study guide that links source coding, channel coding,
secrecy, and thermodynam'
prerequisites:
- thermodynamics-and-entropy
mastery_signals: []
mastery_profile: {}
- id: course
title: Course
description: Candidate concept extracted from lesson 'Course Synthesis'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: synthesis
title: Synthesis
description: Candidate concept extracted from lesson 'Course Synthesis'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: synthesize
title: Synthesize
description: Candidate concept extracted from lesson 'Course Synthesis'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: produce
title: Produce
description: Candidate concept extracted from lesson 'Course Synthesis'.
prerequisites: []
mastery_signals: []
mastery_profile: {}

@ -0,0 +1,3 @@
# Conflict Report
- none

@ -0,0 +1,10 @@
{
"rights_note": "Derived from MIT OpenCourseWare 6.050J Information and Entropy (Spring 2008). Retain MIT OCW attribution and applicable Creative Commons terms before redistribution.",
"sources": [
{
"source_path": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/6-050j-information-and-entropy.md",
"source_type": "markdown",
"title": "6 050J Information And Entropy"
}
]
}

@ -0,0 +1,14 @@
name: mit-ocw-information-and-entropy
display_name: MIT OCW Information and Entropy
version: 0.1.0-draft
schema_version: '1'
didactopus_min_version: 0.1.0
didactopus_max_version: 0.9.99
description: Draft topic pack generated from multi-course inputs for 'MIT OCW Information
and Entropy'.
author: MIT OCW derived demo
license: CC BY-NC-SA 4.0
dependencies: []
overrides: []
profile_templates: {}
cross_pack_links: []

@ -0,0 +1 @@
projects: []

@ -0,0 +1,59 @@
# Review Report
- Module 'Imported from MARKDOWN' has no explicit exercises; mastery signals may be weak.
- Concept 'MIT OCW 6.050J Information and Entropy' has no extracted mastery signals; review manually.
- Concept 'Information' has no extracted mastery signals; review manually.
- Concept 'Entropy' has no extracted mastery signals; review manually.
- Concept 'Source' has no extracted mastery signals; review manually.
- Concept 'OpenCourseWare' has no extracted mastery signals; review manually.
- Concept 'Spring' has no extracted mastery signals; review manually.
- Concept 'Attribution' has no extracted mastery signals; review manually.
- Concept 'Counting and Probability' has no extracted mastery signals; review manually.
- Concept 'Counting' has no extracted mastery signals; review manually.
- Concept 'Probability' has no extracted mastery signals; review manually.
- Concept 'Objective' has no extracted mastery signals; review manually.
- Concept 'Explain' has no extracted mastery signals; review manually.
- Concept 'Exercise' has no extracted mastery signals; review manually.
- Concept 'Derive' has no extracted mastery signals; review manually.
- Concept 'This' has no extracted mastery signals; review manually.
- Concept 'Random' has no extracted mastery signals; review manually.
- Concept 'Shannon Entropy' has no extracted mastery signals; review manually.
- Concept 'Shannon' has no extracted mastery signals; review manually.
- Concept 'Compute' has no extracted mastery signals; review manually.
- Concept 'Bernoulli' has no extracted mastery signals; review manually.
- Concept 'Mutual Information' has no extracted mastery signals; review manually.
- Concept 'Mutual' has no extracted mastery signals; review manually.
- Concept 'Compare' has no extracted mastery signals; review manually.
- Concept 'Dependence' has no extracted mastery signals; review manually.
- Concept 'Data Compression' has no extracted mastery signals; review manually.
- Concept 'Data' has no extracted mastery signals; review manually.
- Concept 'Compression' has no extracted mastery signals; review manually.
- Concept 'Describe' has no extracted mastery signals; review manually.
- Concept 'Redundancy' has no extracted mastery signals; review manually.
- Concept 'Huffman Coding' has no extracted mastery signals; review manually.
- Concept 'Huffman' has no extracted mastery signals; review manually.
- Concept 'Coding' has no extracted mastery signals; review manually.
- Concept 'Build' has no extracted mastery signals; review manually.
- Concept 'Prefix' has no extracted mastery signals; review manually.
- Concept 'Channel Capacity' has no extracted mastery signals; review manually.
- Concept 'Channel' has no extracted mastery signals; review manually.
- Concept 'Capacity' has no extracted mastery signals; review manually.
- Concept 'State' has no extracted mastery signals; review manually.
- Concept 'Reliable' has no extracted mastery signals; review manually.
- Concept 'Channel Coding' has no extracted mastery signals; review manually.
- Concept 'Contrast' has no extracted mastery signals; review manually.
- Concept 'Decoding' has no extracted mastery signals; review manually.
- Concept 'Error Correcting Codes' has no extracted mastery signals; review manually.
- Concept 'Error' has no extracted mastery signals; review manually.
- Concept 'Correcting' has no extracted mastery signals; review manually.
- Concept 'Codes' has no extracted mastery signals; review manually.
- Concept 'Cryptography and Information Hiding' has no extracted mastery signals; review manually.
- Concept 'Cryptography' has no extracted mastery signals; review manually.
- Concept 'Hiding' has no extracted mastery signals; review manually.
- Concept 'Thermodynamics and Entropy' has no extracted mastery signals; review manually.
- Concept 'Thermodynamics' has no extracted mastery signals; review manually.
- Concept 'Course Synthesis' has no extracted mastery signals; review manually.
- Concept 'Course' has no extracted mastery signals; review manually.
- Concept 'Synthesis' has no extracted mastery signals; review manually.
- Concept 'Synthesize' has no extracted mastery signals; review manually.
- Concept 'Produce' has no extracted mastery signals; review manually.

@ -0,0 +1,17 @@
stages:
- id: stage-1
title: Imported from MARKDOWN
concepts:
- mit-ocw-6-050j-information-and-entropy
- counting-and-probability
- shannon-entropy
- mutual-information
- data-compression
- huffman-coding
- channel-capacity
- channel-coding
- error-correcting-codes
- cryptography-and-information-hiding
- thermodynamics-and-entropy
- course-synthesis
checkpoint: []

@ -0,0 +1,6 @@
rubrics:
- id: draft-rubric
title: Draft Rubric
criteria:
- correctness
- explanation

@ -0,0 +1,61 @@
{
"learner_id": "ocw-information-entropy-agent",
"domain": "MIT OCW Information and Entropy",
"artifacts": [
{
"concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
"artifact_type": "symbolic",
"artifact_name": "mit-ocw-6-050j-information-and-entropy.md"
},
{
"concept": "mit-ocw-information-and-entropy::counting-and-probability",
"artifact_type": "symbolic",
"artifact_name": "counting-and-probability.md"
},
{
"concept": "mit-ocw-information-and-entropy::shannon-entropy",
"artifact_type": "symbolic",
"artifact_name": "shannon-entropy.md"
},
{
"concept": "mit-ocw-information-and-entropy::mutual-information",
"artifact_type": "symbolic",
"artifact_name": "mutual-information.md"
},
{
"concept": "mit-ocw-information-and-entropy::data-compression",
"artifact_type": "symbolic",
"artifact_name": "data-compression.md"
},
{
"concept": "mit-ocw-information-and-entropy::huffman-coding",
"artifact_type": "symbolic",
"artifact_name": "huffman-coding.md"
},
{
"concept": "mit-ocw-information-and-entropy::channel-capacity",
"artifact_type": "symbolic",
"artifact_name": "channel-capacity.md"
},
{
"concept": "mit-ocw-information-and-entropy::channel-coding",
"artifact_type": "symbolic",
"artifact_name": "channel-coding.md"
},
{
"concept": "mit-ocw-information-and-entropy::error-correcting-codes",
"artifact_type": "symbolic",
"artifact_name": "error-correcting-codes.md"
},
{
"concept": "mit-ocw-information-and-entropy::cryptography-and-information-hiding",
"artifact_type": "symbolic",
"artifact_name": "cryptography-and-information-hiding.md"
},
{
"concept": "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
"artifact_type": "symbolic",
"artifact_name": "thermodynamics-and-entropy.md"
}
]
}

@ -0,0 +1,145 @@
{
"learner_id": "ocw-information-entropy-agent",
"display_name": "OCW Information Entropy Agent",
"domain": "MIT OCW Information and Entropy",
"mastered_concepts": [
"mit-ocw-information-and-entropy::channel-capacity",
"mit-ocw-information-and-entropy::channel-coding",
"mit-ocw-information-and-entropy::counting-and-probability",
"mit-ocw-information-and-entropy::cryptography-and-information-hiding",
"mit-ocw-information-and-entropy::data-compression",
"mit-ocw-information-and-entropy::error-correcting-codes",
"mit-ocw-information-and-entropy::huffman-coding",
"mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
"mit-ocw-information-and-entropy::mutual-information",
"mit-ocw-information-and-entropy::shannon-entropy",
"mit-ocw-information-and-entropy::thermodynamics-and-entropy"
],
"weak_dimensions_by_concept": {
"mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy": [],
"mit-ocw-information-and-entropy::counting-and-probability": [],
"mit-ocw-information-and-entropy::shannon-entropy": [],
"mit-ocw-information-and-entropy::mutual-information": [],
"mit-ocw-information-and-entropy::data-compression": [],
"mit-ocw-information-and-entropy::huffman-coding": [],
"mit-ocw-information-and-entropy::channel-capacity": [],
"mit-ocw-information-and-entropy::channel-coding": [],
"mit-ocw-information-and-entropy::error-correcting-codes": [],
"mit-ocw-information-and-entropy::cryptography-and-information-hiding": [],
"mit-ocw-information-and-entropy::thermodynamics-and-entropy": []
},
"evaluator_summary_by_concept": {
"mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::counting-and-probability": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::shannon-entropy": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::mutual-information": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::data-compression": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::huffman-coding": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::channel-capacity": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::channel-coding": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::error-correcting-codes": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::cryptography-and-information-hiding": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
},
"mit-ocw-information-and-entropy::thermodynamics-and-entropy": {
"correctness": 0.8400000000000001,
"explanation": 0.85,
"critique": 0.7999999999999999
}
},
"artifacts": [
{
"concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
"artifact_type": "symbolic",
"artifact_name": "mit-ocw-6-050j-information-and-entropy.md"
},
{
"concept": "mit-ocw-information-and-entropy::counting-and-probability",
"artifact_type": "symbolic",
"artifact_name": "counting-and-probability.md"
},
{
"concept": "mit-ocw-information-and-entropy::shannon-entropy",
"artifact_type": "symbolic",
"artifact_name": "shannon-entropy.md"
},
{
"concept": "mit-ocw-information-and-entropy::mutual-information",
"artifact_type": "symbolic",
"artifact_name": "mutual-information.md"
},
{
"concept": "mit-ocw-information-and-entropy::data-compression",
"artifact_type": "symbolic",
"artifact_name": "data-compression.md"
},
{
"concept": "mit-ocw-information-and-entropy::huffman-coding",
"artifact_type": "symbolic",
"artifact_name": "huffman-coding.md"
},
{
"concept": "mit-ocw-information-and-entropy::channel-capacity",
"artifact_type": "symbolic",
"artifact_name": "channel-capacity.md"
},
{
"concept": "mit-ocw-information-and-entropy::channel-coding",
"artifact_type": "symbolic",
"artifact_name": "channel-coding.md"
},
{
"concept": "mit-ocw-information-and-entropy::error-correcting-codes",
"artifact_type": "symbolic",
"artifact_name": "error-correcting-codes.md"
},
{
"concept": "mit-ocw-information-and-entropy::cryptography-and-information-hiding",
"artifact_type": "symbolic",
"artifact_name": "cryptography-and-information-hiding.md"
},
{
"concept": "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
"artifact_type": "symbolic",
"artifact_name": "thermodynamics-and-entropy.md"
}
]
}

@ -0,0 +1,97 @@
# Capability Profile: OCW Information Entropy Agent
- Learner ID: `ocw-information-entropy-agent`
- Domain: `MIT OCW Information and Entropy`
## Mastered Concepts
- mit-ocw-information-and-entropy::channel-capacity
- mit-ocw-information-and-entropy::channel-coding
- mit-ocw-information-and-entropy::counting-and-probability
- mit-ocw-information-and-entropy::cryptography-and-information-hiding
- mit-ocw-information-and-entropy::data-compression
- mit-ocw-information-and-entropy::error-correcting-codes
- mit-ocw-information-and-entropy::huffman-coding
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy
- mit-ocw-information-and-entropy::mutual-information
- mit-ocw-information-and-entropy::shannon-entropy
- mit-ocw-information-and-entropy::thermodynamics-and-entropy
## Concept Summaries
### mit-ocw-information-and-entropy::channel-capacity
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::channel-coding
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::counting-and-probability
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::cryptography-and-information-hiding
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::data-compression
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::error-correcting-codes
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::huffman-coding
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::mutual-information
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::shannon-entropy
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::thermodynamics-and-entropy
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
## Artifacts
- mit-ocw-6-050j-information-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy
- counting-and-probability.md (symbolic) for mit-ocw-information-and-entropy::counting-and-probability
- shannon-entropy.md (symbolic) for mit-ocw-information-and-entropy::shannon-entropy
- mutual-information.md (symbolic) for mit-ocw-information-and-entropy::mutual-information
- data-compression.md (symbolic) for mit-ocw-information-and-entropy::data-compression
- huffman-coding.md (symbolic) for mit-ocw-information-and-entropy::huffman-coding
- channel-capacity.md (symbolic) for mit-ocw-information-and-entropy::channel-capacity
- channel-coding.md (symbolic) for mit-ocw-information-and-entropy::channel-coding
- error-correcting-codes.md (symbolic) for mit-ocw-information-and-entropy::error-correcting-codes
- cryptography-and-information-hiding.md (symbolic) for mit-ocw-information-and-entropy::cryptography-and-information-hiding
- thermodynamics-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::thermodynamics-and-entropy

{
  "course_source": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/6-050j-information-and-entropy.md",
  "pack_dir": "/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy",
  "skill_dir": "/home/netuser/dev/Didactopustry1/skills/ocw-information-entropy-agent",
  "review_flags": [
    "Module 'Imported from MARKDOWN' has no explicit exercises; mastery signals may be weak.",
    "Concept 'MIT OCW 6.050J Information and Entropy' has no extracted mastery signals; review manually.",
    "Concept 'Information' has no extracted mastery signals; review manually.",
    "Concept 'Entropy' has no extracted mastery signals; review manually.",
    "Concept 'Source' has no extracted mastery signals; review manually.",
    "Concept 'OpenCourseWare' has no extracted mastery signals; review manually.",
    "Concept 'Spring' has no extracted mastery signals; review manually.",
    "Concept 'Attribution' has no extracted mastery signals; review manually.",
    "Concept 'Counting and Probability' has no extracted mastery signals; review manually.",
    "Concept 'Counting' has no extracted mastery signals; review manually.",
    "Concept 'Probability' has no extracted mastery signals; review manually.",
    "Concept 'Objective' has no extracted mastery signals; review manually.",
    "Concept 'Explain' has no extracted mastery signals; review manually.",
    "Concept 'Exercise' has no extracted mastery signals; review manually.",
    "Concept 'Derive' has no extracted mastery signals; review manually.",
    "Concept 'This' has no extracted mastery signals; review manually.",
    "Concept 'Random' has no extracted mastery signals; review manually.",
    "Concept 'Shannon Entropy' has no extracted mastery signals; review manually.",
    "Concept 'Shannon' has no extracted mastery signals; review manually.",
    "Concept 'Compute' has no extracted mastery signals; review manually.",
    "Concept 'Bernoulli' has no extracted mastery signals; review manually.",
    "Concept 'Mutual Information' has no extracted mastery signals; review manually.",
    "Concept 'Mutual' has no extracted mastery signals; review manually.",
    "Concept 'Compare' has no extracted mastery signals; review manually.",
    "Concept 'Dependence' has no extracted mastery signals; review manually.",
    "Concept 'Data Compression' has no extracted mastery signals; review manually.",
    "Concept 'Data' has no extracted mastery signals; review manually.",
    "Concept 'Compression' has no extracted mastery signals; review manually.",
    "Concept 'Describe' has no extracted mastery signals; review manually.",
    "Concept 'Redundancy' has no extracted mastery signals; review manually.",
    "Concept 'Huffman Coding' has no extracted mastery signals; review manually.",
    "Concept 'Huffman' has no extracted mastery signals; review manually.",
    "Concept 'Coding' has no extracted mastery signals; review manually.",
    "Concept 'Build' has no extracted mastery signals; review manually.",
    "Concept 'Prefix' has no extracted mastery signals; review manually.",
    "Concept 'Channel Capacity' has no extracted mastery signals; review manually.",
    "Concept 'Channel' has no extracted mastery signals; review manually.",
    "Concept 'Capacity' has no extracted mastery signals; review manually.",
    "Concept 'State' has no extracted mastery signals; review manually.",
    "Concept 'Reliable' has no extracted mastery signals; review manually.",
    "Concept 'Channel Coding' has no extracted mastery signals; review manually.",
    "Concept 'Contrast' has no extracted mastery signals; review manually.",
    "Concept 'Decoding' has no extracted mastery signals; review manually.",
    "Concept 'Error Correcting Codes' has no extracted mastery signals; review manually.",
    "Concept 'Error' has no extracted mastery signals; review manually.",
    "Concept 'Correcting' has no extracted mastery signals; review manually.",
    "Concept 'Codes' has no extracted mastery signals; review manually.",
    "Concept 'Cryptography and Information Hiding' has no extracted mastery signals; review manually.",
    "Concept 'Cryptography' has no extracted mastery signals; review manually.",
    "Concept 'Hiding' has no extracted mastery signals; review manually.",
    "Concept 'Thermodynamics and Entropy' has no extracted mastery signals; review manually.",
    "Concept 'Thermodynamics' has no extracted mastery signals; review manually.",
    "Concept 'Course Synthesis' has no extracted mastery signals; review manually.",
    "Concept 'Course' has no extracted mastery signals; review manually.",
    "Concept 'Synthesis' has no extracted mastery signals; review manually.",
    "Concept 'Synthesize' has no extracted mastery signals; review manually.",
    "Concept 'Produce' has no extracted mastery signals; review manually."
  ],
  "concept_count": 56,
  "target_concept": "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
  "curriculum_path": [
    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
    "mit-ocw-information-and-entropy::counting-and-probability",
    "mit-ocw-information-and-entropy::shannon-entropy",
    "mit-ocw-information-and-entropy::mutual-information",
    "mit-ocw-information-and-entropy::data-compression",
    "mit-ocw-information-and-entropy::huffman-coding",
    "mit-ocw-information-and-entropy::channel-capacity",
    "mit-ocw-information-and-entropy::channel-coding",
    "mit-ocw-information-and-entropy::error-correcting-codes",
    "mit-ocw-information-and-entropy::cryptography-and-information-hiding",
    "mit-ocw-information-and-entropy::thermodynamics-and-entropy"
  ],
  "mastered_concepts": [
    "mit-ocw-information-and-entropy::channel-capacity",
    "mit-ocw-information-and-entropy::channel-coding",
    "mit-ocw-information-and-entropy::counting-and-probability",
    "mit-ocw-information-and-entropy::cryptography-and-information-hiding",
    "mit-ocw-information-and-entropy::data-compression",
    "mit-ocw-information-and-entropy::error-correcting-codes",
    "mit-ocw-information-and-entropy::huffman-coding",
    "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy",
    "mit-ocw-information-and-entropy::mutual-information",
    "mit-ocw-information-and-entropy::shannon-entropy",
    "mit-ocw-information-and-entropy::thermodynamics-and-entropy"
  ],
  "artifact_count": 11
}
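Because the run summary is plain JSON, downstream tooling can consume it directly. A minimal sketch, assuming a run-summary-shaped dict (the `run_summary` literal below is a hand-built stand-in using the same keys as the file above, not the real generated payload):

```python
# Stand-in payload with the same keys as the generated run_summary.json.
run_summary = {
    "concept_count": 56,
    "target_concept": "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
    "review_flags": [
        "Module 'Imported from MARKDOWN' has no explicit exercises; mastery signals may be weak.",
        "Concept 'Entropy' has no extracted mastery signals; review manually.",
    ],
    "curriculum_path": [
        "mit-ocw-information-and-entropy::shannon-entropy",
        "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
    ],
    "mastered_concepts": [
        "mit-ocw-information-and-entropy::shannon-entropy",
        "mit-ocw-information-and-entropy::thermodynamics-and-entropy",
    ],
}


def unmastered_steps(summary: dict) -> list[str]:
    """Curriculum steps not yet in the mastered set, in path order."""
    mastered = set(summary.get("mastered_concepts", []))
    return [step for step in summary.get("curriculum_path", []) if step not in mastered]


print(unmastered_steps(run_summary))     # empty here: every path step is mastered
print(len(run_summary["review_flags"]))  # count of flags that still need a human look
```

In the real run all curriculum steps are mastered, so the interesting signal is usually `review_flags`, which the generator uses to mark concepts with no extracted mastery signals.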

# Capability Profile: OCW Information Entropy Agent
- Learner ID: `ocw-information-entropy-agent`
- Domain: `MIT OCW Information and Entropy`
## Mastered Concepts
- mit-ocw-information-and-entropy::channel-capacity
- mit-ocw-information-and-entropy::channel-coding
- mit-ocw-information-and-entropy::counting-and-probability
- mit-ocw-information-and-entropy::cryptography-and-information-hiding
- mit-ocw-information-and-entropy::data-compression
- mit-ocw-information-and-entropy::error-correcting-codes
- mit-ocw-information-and-entropy::huffman-coding
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy
- mit-ocw-information-and-entropy::mutual-information
- mit-ocw-information-and-entropy::shannon-entropy
- mit-ocw-information-and-entropy::thermodynamics-and-entropy
## Concept Summaries
### mit-ocw-information-and-entropy::channel-capacity
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::channel-coding
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::counting-and-probability
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::cryptography-and-information-hiding
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::data-compression
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::error-correcting-codes
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::huffman-coding
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::mutual-information
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::shannon-entropy
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
### mit-ocw-information-and-entropy::thermodynamics-and-entropy
- correctness: 0.84
- critique: 0.80
- explanation: 0.85
- weak dimensions: none
## Artifacts
- mit-ocw-6-050j-information-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy
- counting-and-probability.md (symbolic) for mit-ocw-information-and-entropy::counting-and-probability
- shannon-entropy.md (symbolic) for mit-ocw-information-and-entropy::shannon-entropy
- mutual-information.md (symbolic) for mit-ocw-information-and-entropy::mutual-information
- data-compression.md (symbolic) for mit-ocw-information-and-entropy::data-compression
- huffman-coding.md (symbolic) for mit-ocw-information-and-entropy::huffman-coding
- channel-capacity.md (symbolic) for mit-ocw-information-and-entropy::channel-capacity
- channel-coding.md (symbolic) for mit-ocw-information-and-entropy::channel-coding
- error-correcting-codes.md (symbolic) for mit-ocw-information-and-entropy::error-correcting-codes
- cryptography-and-information-hiding.md (symbolic) for mit-ocw-information-and-entropy::cryptography-and-information-hiding
- thermodynamics-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::thermodynamics-and-entropy
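The "weak dimensions" lines in the profile above follow from the per-concept evaluator means. A minimal sketch of that derivation, assuming a 0.75 cut-off (the threshold here is an illustrative assumption, not a setting taken from the project):

```python
# Per-concept evaluator means, matching the numbers reported in the profile above.
scores = {"correctness": 0.84, "critique": 0.80, "explanation": 0.85}


def weak_dimensions(dims: dict[str, float], threshold: float = 0.75) -> list[str]:
    """Dimensions scoring below the threshold, sorted for stable output.

    The 0.75 default is an assumption for illustration only.
    """
    return sorted(name for name, value in dims.items() if value < threshold)


print(weak_dimensions(scores) or "none")  # matches the "weak dimensions: none" lines
```

With the demo learner's scores (0.84/0.80/0.85) no dimension falls below the cut-off, which is why every concept reports `weak dimensions: none`.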

# Generated Course Summary
- Pack dir: `/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy`
- Run dir: `/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy-run`
## Curriculum Path Used By The Demo Learner
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy
- mit-ocw-information-and-entropy::counting-and-probability
- mit-ocw-information-and-entropy::shannon-entropy
- mit-ocw-information-and-entropy::mutual-information
- mit-ocw-information-and-entropy::data-compression
- mit-ocw-information-and-entropy::huffman-coding
- mit-ocw-information-and-entropy::channel-capacity
- mit-ocw-information-and-entropy::channel-coding
- mit-ocw-information-and-entropy::error-correcting-codes
- mit-ocw-information-and-entropy::cryptography-and-information-hiding
- mit-ocw-information-and-entropy::thermodynamics-and-entropy
## Mastered Concepts
- mit-ocw-information-and-entropy::channel-capacity
- mit-ocw-information-and-entropy::channel-coding
- mit-ocw-information-and-entropy::counting-and-probability
- mit-ocw-information-and-entropy::cryptography-and-information-hiding
- mit-ocw-information-and-entropy::data-compression
- mit-ocw-information-and-entropy::error-correcting-codes
- mit-ocw-information-and-entropy::huffman-coding
- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy
- mit-ocw-information-and-entropy::mutual-information
- mit-ocw-information-and-entropy::shannon-entropy
- mit-ocw-information-and-entropy::thermodynamics-and-entropy

from __future__ import annotations

import json
from dataclasses import dataclass
from pathlib import Path

import yaml

from .evaluator_pipeline import (
    CritiqueEvaluator,
    LearnerAttempt,
    RubricEvaluator,
    SymbolicRuleEvaluator,
    aggregate,
    run_pipeline,
)


@dataclass
class SkillContext:
    """Everything the demo needs from an exported skill directory."""

    skill_name: str
    skill_description: str
    course_summary: str
    capability_summary: str
    pack: dict
    concepts: list[dict]
    capability_profile: dict
    run_summary: dict


def load_ocw_skill_context(skill_dir: str | Path) -> SkillContext:
    """Load SKILL.md metadata plus the generated pack, profile, and run summary."""
    skill_dir = Path(skill_dir)
    skill_text = (skill_dir / "SKILL.md").read_text(encoding="utf-8")
    skill_name = "ocw-information-entropy-agent"
    skill_description = ""
    lines = skill_text.splitlines()
    for idx, line in enumerate(lines):
        if line.strip() == "---":
            continue
        if line.startswith("name:"):
            skill_name = line.split(":", 1)[1].strip()
        if line.startswith("description:"):
            skill_description = line.split(":", 1)[1].strip()
        if idx > 10 and skill_description:
            # Only the front-matter block is interesting; stop scanning early.
            break
    pack_dir = skill_dir / "assets" / "generated" / "pack"
    run_dir = skill_dir / "assets" / "generated" / "run"
    return SkillContext(
        skill_name=skill_name,
        skill_description=skill_description,
        course_summary=(skill_dir / "references" / "generated-course-summary.md").read_text(encoding="utf-8"),
        capability_summary=(skill_dir / "references" / "generated-capability-summary.md").read_text(encoding="utf-8"),
        pack=yaml.safe_load((pack_dir / "pack.yaml").read_text(encoding="utf-8")) or {},
        concepts=(yaml.safe_load((pack_dir / "concepts.yaml").read_text(encoding="utf-8")) or {}).get("concepts", []),
        capability_profile=json.loads((run_dir / "capability_profile.json").read_text(encoding="utf-8")),
        run_summary=json.loads((run_dir / "run_summary.json").read_text(encoding="utf-8")),
    )


def _concept_key(pack_name: str, concept_id: str) -> str:
    return f"{pack_name}::{concept_id}"


def _match_concepts(context: SkillContext, task: str, limit: int = 3) -> list[dict]:
    """Rank concepts by keyword overlap between the task text and concept metadata."""
    task_lower = task.lower()
    scored = []
    for concept in context.concepts:
        text = " ".join(
            [
                str(concept.get("id", "")),
                str(concept.get("title", "")),
                str(concept.get("description", "")),
            ]
        ).lower()
        score = sum(1 for token in task_lower.split() if token in text)
        if score:
            scored.append((score, concept))
    scored.sort(key=lambda item: (item[0], len(item[1].get("prerequisites", []))), reverse=True)
    return [concept for _, concept in scored[:limit]]


def build_skill_grounded_study_plan(context: SkillContext, target_task: str) -> dict:
    """Turn a free-text task into study steps grounded in the capability profile."""
    pack_name = context.pack.get("name", "mit-ocw-information-and-entropy")
    matched = _match_concepts(context, target_task)
    if not matched:
        # Fall back to core entropy concepts when nothing matches the task text.
        matched = [c for c in context.concepts if c.get("id") in {"shannon-entropy", "channel-capacity", "thermodynamics-and-entropy"}]
    steps = []
    for concept in matched:
        concept_id = concept["id"]
        concept_key = _concept_key(pack_name, concept_id)
        steps.append(
            {
                "concept_key": concept_key,
                "title": concept["title"],
                "status": "mastered" if concept_key in context.capability_profile.get("mastered_concepts", []) else "review-needed",
                "prerequisites": [
                    _concept_key(pack_name, prereq) for prereq in concept.get("prerequisites", [])
                ],
                "recommended_action": (
                    f"Use {concept['title']} as the primary teaching anchor."
                    if concept_key in context.capability_profile.get("mastered_concepts", [])
                    else f"Review prerequisites before teaching {concept['title']}."
                ),
            }
        )
    return {
        "skill": context.skill_name,
        "task": target_task,
        "steps": steps,
        "guided_path_reference": list(context.run_summary.get("curriculum_path", [])),
    }


def build_skill_grounded_explanation(context: SkillContext, concept_id: str) -> dict:
    """Produce an explanation for one concept, citing its evaluator summary."""
    pack_name = context.pack.get("name", "mit-ocw-information-and-entropy")
    concept = next((item for item in context.concepts if item.get("id") == concept_id), None)
    if concept is None:
        raise KeyError(f"Unknown concept id: {concept_id}")
    concept_key = _concept_key(pack_name, concept_id)
    summary = context.capability_profile.get("evaluator_summary_by_concept", {}).get(concept_key, {})
    explanation = (
        f"{concept['title']} is represented in the Information and Entropy skill as part of a progression from "
        f"foundational probability ideas toward communication limits and physical interpretation. "
        f"It depends on {', '.join(concept.get('prerequisites', []) or ['no explicit prerequisites in the generated pack'])}. "
        f"The current demo learner already mastered this concept, with evaluator means {summary}, so the skill can use it as a stable explanation anchor."
    )
    return {
        "concept_key": concept_key,
        "title": concept["title"],
        "explanation": explanation,
        "source_description": concept.get("description", ""),
    }


def evaluate_submission_with_skill(context: SkillContext, concept_id: str, submission: str) -> dict:
    """Run a learner submission through the evaluator pipeline and summarize the result."""
    pack_name = context.pack.get("name", "mit-ocw-information-and-entropy")
    concept_key = _concept_key(pack_name, concept_id)
    attempt = LearnerAttempt(
        concept=concept_key,
        artifact_type="symbolic",
        content=submission,
        metadata={"skill_name": context.skill_name},
    )
    results = run_pipeline(attempt, [RubricEvaluator(), SymbolicRuleEvaluator(), CritiqueEvaluator()])
    aggregated = aggregate(results)
    mastered_reference = concept_key in context.capability_profile.get("mastered_concepts", [])
    verdict = "acceptable" if aggregated.get("correctness", 0.0) >= 0.75 and aggregated.get("explanation", 0.0) >= 0.75 else "needs_revision"
    return {
        "concept_key": concept_key,
        "submission": submission,
        "verdict": verdict,
        "aggregated": aggregated,
        "evaluators": [
            {"name": result.evaluator_name, "dimensions": result.dimensions, "notes": result.notes}
            for result in results
        ],
        "skill_reference": {
            "skill_name": context.skill_name,
            "mastered_by_demo_agent": mastered_reference,
        },
        "follow_up": (
            "Extend the answer with an explicit limitation or assumption."
            if verdict == "acceptable"
            else "Rework the answer so it states the equality/relationship explicitly and explains why it matters."
        ),
    }


def run_ocw_skill_agent_demo(skill_dir: str | Path, out_dir: str | Path) -> dict:
    """Exercise the skill end to end: plan, explain, and evaluate, then write artifacts."""
    context = load_ocw_skill_context(skill_dir)
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    study_plan = build_skill_grounded_study_plan(
        context,
        "Help a learner connect Shannon entropy, channel capacity, and thermodynamic entropy.",
    )
    explanation = build_skill_grounded_explanation(context, "channel-capacity")
    evaluation = evaluate_submission_with_skill(
        context,
        "thermodynamics-and-entropy",
        "Therefore entropy = uncertainty in a message model, but one limitation is that thermodynamic entropy and Shannon entropy are not identical without careful interpretation.",
    )
    payload = {
        "skill": {
            "name": context.skill_name,
            "description": context.skill_description,
        },
        "study_plan": study_plan,
        "explanation": explanation,
        "evaluation": evaluation,
    }
    (out_dir / "skill_demo.json").write_text(json.dumps(payload, indent=2), encoding="utf-8")
    lines = [
        "# OCW Information and Entropy Skill Demo",
        "",
        f"- Skill: `{context.skill_name}`",
        f"- Description: {context.skill_description}",
        "",
        "## Study Plan",
    ]
    for step in study_plan["steps"]:
        lines.append(f"- {step['title']} ({step['status']}): {step['recommended_action']}")
    lines.extend(
        [
            "",
            "## Explanation Demo",
            explanation["explanation"],
            "",
            "## Evaluation Demo",
            f"- Verdict: {evaluation['verdict']}",
            f"- Aggregated dimensions: {evaluation['aggregated']}",
            f"- Follow-up: {evaluation['follow_up']}",
        ]
    )
    (out_dir / "skill_demo.md").write_text("\n".join(lines), encoding="utf-8")
    return payload


def main() -> None:
    import argparse

    root = Path(__file__).resolve().parents[2]
    parser = argparse.ArgumentParser(description="Show an agentic system using the Information and Entropy knowledge export as a skill.")
    parser.add_argument(
        "--skill-dir",
        default=str(root / "skills" / "ocw-information-entropy-agent"),
    )
    parser.add_argument(
        "--out-dir",
        default=str(root / "examples" / "ocw-information-entropy-skill-demo"),
    )
    args = parser.parse_args()
    payload = run_ocw_skill_agent_demo(args.skill_dir, args.out_dir)
    print(json.dumps(payload, indent=2))


if __name__ == "__main__":
    main()
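The `_match_concepts` heuristic above is plain keyword overlap, which is easy to check in isolation. A standalone sketch of the same idea, using hand-written stand-in concepts rather than the generated pack:

```python
# Stand-in concepts illustrating the keyword-overlap matching heuristic.
concepts = [
    {"id": "shannon-entropy", "title": "Shannon Entropy", "description": "Average surprise of a source."},
    {"id": "huffman-coding", "title": "Huffman Coding", "description": "Prefix codes for compression."},
]


def match_concepts(concepts: list[dict], task: str, limit: int = 3) -> list[dict]:
    """Score each concept by how many task tokens appear in its metadata text."""
    task_lower = task.lower()
    scored = []
    for concept in concepts:
        text = " ".join(str(concept.get(k, "")) for k in ("id", "title", "description")).lower()
        score = sum(1 for token in task_lower.split() if token in text)
        if score:
            scored.append((score, concept))
    scored.sort(key=lambda item: item[0], reverse=True)
    return [concept for _, concept in scored[:limit]]


print([c["id"] for c in match_concepts(concepts, "explain shannon entropy")])  # ["shannon-entropy"]
```

The repository version additionally breaks ties using prerequisite count; this sketch keeps only the overlap score to show the core mechanism.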