diff --git a/README.md b/README.md index 1f64e54..4cbaa24 100644 --- a/README.md +++ b/README.md @@ -29,6 +29,7 @@ Then open: - `examples/ocw-information-entropy-run/learner_progress.html` - `examples/ocw-information-entropy-skill-demo/skill_demo.md` +- `examples/ocw-information-entropy-rolemesh-transcript/rolemesh_transcript.md` - `skills/ocw-information-entropy-agent/` That gives you: @@ -37,6 +38,7 @@ That gives you: - a visible learning path - progress artifacts - a reusable skill grounded in the exported knowledge +- a transcript showing how a local-LLM-backed learner/mentor interaction can look The point is not to replace your effort. The point is to give your effort structure, feedback, and momentum. @@ -197,6 +199,7 @@ Primary outputs: - `review_report.md` - `conflict_report.md` - `license_attribution.json` +- `pack_compliance_manifest.json` when a source inventory is provided ### 2. Review and workspace management @@ -264,7 +267,24 @@ Key capabilities: - markdown capability reports - artifact manifests -### 5. Agentic learner demos and visualization +### 5. Local model integration + +Didactopus can now target a RoleMesh Gateway-backed local LLM setup through its `ModelProvider` abstraction. + +Main modules: + +- `didactopus.model_provider` +- `didactopus.role_prompts` +- `didactopus.rolemesh_demo` + +What this enables: + +- role-based local model routing +- separate mentor/practice/project-advisor/evaluator prompts +- local heterogeneous compute usage through an OpenAI-compatible gateway +- a clean path to keep tutoring assistance structured instead of offloading learner work + +### 6. Agentic learner demos and visualization The repository includes deterministic agentic demos rather than a live external model integration. @@ -306,6 +326,43 @@ This writes: - `examples/ocw-information-entropy-run/` - `skills/ocw-information-entropy-agent/` +The generated MIT OCW pack also includes: + +- `license_attribution.json` +- `pack_compliance_manifest.json` +- `source_inventory.yaml` + +### Try the local RoleMesh integration path + +Stubbed local-provider demo: + +```bash +python -m didactopus.rolemesh_demo --config configs/config.example.yaml +``` + +RoleMesh-backed example config: + +```bash +python -m didactopus.rolemesh_demo --config configs/config.rolemesh.example.yaml +``` + +MIT OCW learner transcript through the local-LLM path: + +```bash +python -m didactopus.ocw_rolemesh_transcript_demo --config configs/config.rolemesh.example.yaml +``` + +If your local models are slow, Didactopus now prints pending-status lines while each mentor, practice, learner, or evaluator turn is being generated. For a long manual run, capture both the transcript payload and those live status messages: + +```bash +python -u -m didactopus.ocw_rolemesh_transcript_demo \ + --config configs/config.rolemesh.example.yaml \ + --out-dir examples/ocw-information-entropy-rolemesh-transcript \ + 2>&1 | tee examples/ocw-information-entropy-rolemesh-transcript/manual-run.log +``` + +That command leaves the final transcript in `rolemesh_transcript.md` and `rolemesh_transcript.json`, while `manual-run.log` preserves the conversational “working on it” notices during the wait. 
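+
+After the run finishes, a quick sanity check can confirm the artifacts landed where expected. This is a minimal sketch using the same output paths as the command above; adjust the directory if you pass a different `--out-dir`.
+
+```bash
+# List the two transcript artifacts and peek at the tail of the captured log.
+ls -l examples/ocw-information-entropy-rolemesh-transcript/rolemesh_transcript.md \
+      examples/ocw-information-entropy-rolemesh-transcript/rolemesh_transcript.json
+tail -n 20 examples/ocw-information-entropy-rolemesh-transcript/manual-run.log
+```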
+ ### Render learner progress visualizations Path-focused view: @@ -357,6 +414,8 @@ What remains heuristic or lightweight: - [docs/mastery-ledger.md](docs/mastery-ledger.md) - [docs/workspace-manager.md](docs/workspace-manager.md) - [docs/interactive-review-ui.md](docs/interactive-review-ui.md) +- [docs/mit-ocw-course-guide.md](docs/mit-ocw-course-guide.md) +- [docs/rolemesh-integration.md](docs/rolemesh-integration.md) - [docs/faq.md](docs/faq.md) ## MIT OCW Demo Notes diff --git a/configs/config.example.yaml b/configs/config.example.yaml index 9c20d36..f40ca38 100644 --- a/configs/config.example.yaml +++ b/configs/config.example.yaml @@ -6,3 +6,8 @@ bridge: port: 8765 registry_path: "workspace_registry.json" default_workspace_root: "workspaces" +model_provider: + provider: "stub" + local: + backend: "stub" + model_name: "local-demo" diff --git a/configs/config.rolemesh.example.yaml b/configs/config.rolemesh.example.yaml new file mode 100644 index 0000000..a7f75aa --- /dev/null +++ b/configs/config.rolemesh.example.yaml @@ -0,0 +1,26 @@ +review: + default_reviewer: "Wesley R. Elsberry" + write_promoted_pack: true + +bridge: + host: "127.0.0.1" + port: 8765 + registry_path: "workspace_registry.json" + default_workspace_root: "workspaces" + +model_provider: + provider: "rolemesh" + local: + backend: "stub" + model_name: "unused-when-rolemesh-enabled" + rolemesh: + base_url: "http://127.0.0.1:8000" + api_key: "change-me-client-key-1" + default_model: "planner" + role_to_model: + mentor: "planner" + learner: "writer" + practice: "writer" + project_advisor: "planner" + evaluator: "reviewer" + timeout_seconds: 30.0 diff --git a/docs/course-to-pack.md b/docs/course-to-pack.md index e50beb1..5f2ad6d 100644 --- a/docs/course-to-pack.md +++ b/docs/course-to-pack.md @@ -50,6 +50,11 @@ The pack emitter writes: - `review_report.md` - `conflict_report.md` - `license_attribution.json` +- `source_corpus.json` + +`source_corpus.json` is the main grounded-text artifact. It preserves lesson bodies, objectives, +exercises, and source references from the ingested material so downstream tutoring or evaluation +can rely on source-derived text instead of only the distilled concept graph. ## Rule layer @@ -77,4 +82,4 @@ The end-to-end reference flow in this repository is: python -m didactopus.ocw_information_entropy_demo ``` -That command ingests the MIT OCW Information and Entropy source file in `examples/ocw-information-entropy/`, emits a draft pack into `domain-packs/mit-ocw-information-entropy/`, runs a deterministic agentic learner over the generated path, and writes downstream skill/visualization artifacts. +That command ingests the MIT OCW Information and Entropy source file or directory tree in `examples/ocw-information-entropy/`, emits a draft pack into `domain-packs/mit-ocw-information-entropy/`, writes a grounded `source_corpus.json`, runs a deterministic agentic learner over the generated path, and writes downstream skill/visualization artifacts. diff --git a/docs/rolemesh-integration.md b/docs/rolemesh-integration.md new file mode 100644 index 0000000..7c2289d --- /dev/null +++ b/docs/rolemesh-integration.md @@ -0,0 +1,111 @@ +# RoleMesh Integration + +RoleMesh Gateway is an appropriate dependency for local-LLM-backed Didactopus usage. 
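+
+For orientation, the request shape Didactopus sends through the gateway is the standard OpenAI-compatible chat-completions call. The sketch below is illustrative rather than a fixed contract: the URL, API key, and `planner` alias come from `configs/config.rolemesh.example.yaml`, and bearer-token auth is assumed only because the gateway exposes an OpenAI-compatible API.
+
+```bash
+# Illustrative request against the example gateway config; values are placeholders.
+curl -s http://127.0.0.1:8000/v1/chat/completions \
+  -H "Authorization: Bearer change-me-client-key-1" \
+  -H "Content-Type: application/json" \
+  -d '{
+        "model": "planner",
+        "messages": [
+          {"role": "system", "content": "You are a Socratic mentor. Prefer hints over full answers."},
+          {"role": "user", "content": "Help me plan a first pass at Shannon entropy."}
+        ]
+      }'
+```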
+ +## Why it fits + +The local RoleMesh codebase provides exactly the main things Didactopus needs for a local heterogeneous inference setup: + +- OpenAI-compatible `POST /v1/chat/completions` +- role-based model routing +- local or multi-host upstream registration +- flexible model loading and switching through the gateway/node-agent split +- per-role defaults for temperature and other request settings + +That means Didactopus can keep a simple provider abstraction while delegating model placement and routing to RoleMesh. + +## Recommended architecture + +1. Run RoleMesh Gateway as the OpenAI-compatible front door. +2. Point RoleMesh roles at local backends or discovered node agents. +3. Configure Didactopus to use the `rolemesh` model provider. +4. Let Didactopus send mentor/practice/project-advisor/evaluator requests by role. + +## Didactopus-side config + +Use `configs/config.rolemesh.example.yaml` as the starting point. + +The important fields are: + +- `model_provider.provider: rolemesh` +- `model_provider.rolemesh.base_url` +- `model_provider.rolemesh.api_key` +- `model_provider.rolemesh.default_model` +- `model_provider.rolemesh.role_to_model` + +## Suggested role mapping + +With the sample RoleMesh gateway config, this is a good default mapping: + +- `mentor -> planner` +- `practice -> writer` +- `project_advisor -> planner` +- `evaluator -> reviewer` + +This keeps Didactopus prompts aligned with the role semantics RoleMesh already exposes. + +## Prompt layer + +Didactopus now keeps its default RoleMesh-oriented prompts in: + +- `didactopus.role_prompts` + +These prompts are intentionally anti-offloading: + +- mentor mode prefers Socratic questions and hints +- practice mode prefers reasoning-heavy tasks +- project-advisor mode prefers synthesis work +- evaluator mode prefers critique and explicit limitations + +## Demo command + +To exercise the integration path without a live RoleMesh gateway, run: + +```bash +python -m didactopus.rolemesh_demo --config configs/config.example.yaml +``` + +That uses the stub provider path. + +To point at a live RoleMesh deployment, start from: + +```bash +python -m didactopus.rolemesh_demo --config configs/config.rolemesh.example.yaml +``` + +and replace the placeholder gateway URL/API key with your real local setup. + +## Example transcript + +The repository now includes a generated transcript of an AI learner using the local-LLM path to approach the MIT OCW Information and Entropy course: + +- `examples/ocw-information-entropy-rolemesh-transcript/rolemesh_transcript.md` + +Generator command: + +```bash +python -m didactopus.ocw_rolemesh_transcript_demo --config configs/config.rolemesh.example.yaml +``` + +If some RoleMesh aliases are unhealthy, the transcript demo automatically falls back to the healthy local alias and records that in the output metadata. + +If local inference is slow, the transcript demo now emits pending notices such as “Didactopus is evaluating the work before replying” while each turn is still running. 
For a full manual capture, run: + +```bash +python -u -m didactopus.ocw_rolemesh_transcript_demo \ + --config configs/config.rolemesh.example.yaml \ + --out-dir examples/ocw-information-entropy-rolemesh-transcript \ + 2>&1 | tee examples/ocw-information-entropy-rolemesh-transcript/manual-run.log +``` + +That gives you three artifacts: + +- `rolemesh_transcript.json` +- `rolemesh_transcript.md` +- `manual-run.log` with the live “pending” status messages + +For slower larger models, expect the transcript run to take several minutes rather than seconds. The command above is the recommended way to capture the whole session outside Codex. + +## Gateway-side note + +This repository does not vendor RoleMesh. It assumes the local RoleMesh codebase or deployment exists separately. The reference local codebase mentioned by the user is suitable because it already provides the API and routing semantics Didactopus needs. diff --git a/domain-packs/mit-ocw-information-entropy/concepts.yaml b/domain-packs/mit-ocw-information-entropy/concepts.yaml index e6a84ed..b033978 100644 --- a/domain-packs/mit-ocw-information-entropy/concepts.yaml +++ b/domain-packs/mit-ocw-information-entropy/concepts.yaml @@ -1,54 +1,203 @@ concepts: -- id: mit-ocw-6-050j-information-and-entropy - title: MIT OCW 6.050J Information and Entropy - description: 'Source: MIT OpenCourseWare 6.050J Information and Entropy, Spring - 2008. +- id: mit-ocw-6-050j-information-and-entropy-course-home + title: 'MIT OCW 6.050J Information and Entropy: Course Home' + description: 'Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/ - Attribution: adapted from the OCW course overview, unit sequence, and assigned - textbook references.' + Attribution: adapted from the MIT OpenCourseWare course home page for 6.050J Information + and Entropy.' prerequisites: [] mastery_signals: [] mastery_profile: {} -- id: information - title: Information - description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information - and Entropy'. +- id: information-and-entropy + title: Information and Entropy + description: '- Objective: Identify the course title, instructors, departments, + level, and major topical areas. + + - Exercise: Summarize the course in one paragraph for a prospective learner. + + MIT OpenCourseWare presents 6.050J Information and Entropy as a S' + prerequisites: + - mit-ocw-6-050j-information-and-entropy-course-home + mastery_signals: + - Identify the course title, instructors, departments, level, and major topical + areas. + mastery_profile: {} +- id: paul + title: Paul + description: Candidate concept extracted from lesson 'Information and Entropy'. prerequisites: [] - mastery_signals: [] + mastery_signals: + - Identify the course title, instructors, departments, level, and major topical + areas. + mastery_profile: {} +- id: penfield + title: Penfield + description: Candidate concept extracted from lesson 'Information and Entropy'. + prerequisites: [] + mastery_signals: + - Identify the course title, instructors, departments, level, and major topical + areas. + mastery_profile: {} +- id: seth + title: Seth + description: Candidate concept extracted from lesson 'Information and Entropy'. + prerequisites: [] + mastery_signals: + - Identify the course title, instructors, departments, level, and major topical + areas. + mastery_profile: {} +- id: lloyd + title: Lloyd + description: Candidate concept extracted from lesson 'Information and Entropy'. 
+ prerequisites: [] + mastery_signals: + - Identify the course title, instructors, departments, level, and major topical + areas. + mastery_profile: {} +- id: electrical + title: Electrical + description: Candidate concept extracted from lesson 'Information and Entropy'. + prerequisites: [] + mastery_signals: + - Identify the course title, instructors, departments, level, and major topical + areas. + mastery_profile: {} +- id: engineering + title: Engineering + description: Candidate concept extracted from lesson 'Information and Entropy'. + prerequisites: [] + mastery_signals: + - Identify the course title, instructors, departments, level, and major topical + areas. + mastery_profile: {} +- id: ultimate-limits-to-communication-and-computation + title: Ultimate Limits to Communication and Computation + description: '- Objective: Explain the broad intellectual scope of the course. + + - Exercise: List the main topic clusters that connect communication, computation, + and entropy. + + The course examines the ultimate limits to communication and computation with + em' + prerequisites: + - information-and-entropy + mastery_signals: + - Explain the broad intellectual scope of the course. mastery_profile: {} - id: entropy title: Entropy - description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information - and Entropy'. + description: Candidate concept extracted from lesson 'Ultimate Limits to Communication + and Computation'. prerequisites: [] + mastery_signals: + - Explain the broad intellectual scope of the course. + mastery_profile: {} +- id: open-textbooks-problem-sets-and-programming-work + title: Open Textbooks, Problem Sets, and Programming Work + description: '- Objective: Identify the main kinds of learning resources supplied + through the course. + + - Exercise: Explain how these resource types support both conceptual study and + practice. + + The course home lists open textbooks, problem sets, problem set' + prerequisites: + - ultimate-limits-to-communication-and-computation + mastery_signals: + - Identify the main kinds of learning resources supplied through the course. + mastery_profile: {} +- id: mit-ocw-6-050j-information-and-entropy-syllabus + title: 'MIT OCW 6.050J Information and Entropy: Syllabus' + description: 'Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/ + + Attribution: adapted from the MIT OpenCourseWare syllabus page for 6.050J Information + and Entropy.' + prerequisites: + - open-textbooks-problem-sets-and-programming-work mastery_signals: [] mastery_profile: {} -- id: source - title: Source - description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information - and Entropy'. - prerequisites: [] - mastery_signals: [] +- id: prerequisites-and-mathematical-background + title: Prerequisites and Mathematical Background + description: '- Objective: Explain the mathematical maturity expected by the course. + + - Exercise: Decide whether a learner needs review in probability, linear algebra, + or signals before beginning. + + The syllabus expects a foundation comparable to MIT subjec' + prerequisites: + - mit-ocw-6-050j-information-and-entropy-syllabus + mastery_signals: + - Explain the mathematical maturity expected by the course. mastery_profile: {} -- id: opencourseware - title: OpenCourseWare - description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information - and Entropy'. 
- prerequisites: [] - mastery_signals: [] +- id: assessment-structure + title: Assessment Structure + description: '- Objective: Identify the role of problem sets, exams, and programming + work in the course. + + - Exercise: Build a study schedule that alternates reading, derivation, and worked + exercises. + + The syllabus emphasizes regular problem solving and qua' + prerequisites: + - prerequisites-and-mathematical-background + mastery_signals: + - Identify the role of problem sets, exams, and programming work in the course. mastery_profile: {} -- id: spring - title: Spring - description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information - and Entropy'. - prerequisites: [] - mastery_signals: [] +- id: course-notes-and-reference-texts + title: Course Notes and Reference Texts + description: '- Objective: Explain how the course notes and textbook references + supply the core conceptual sequence. + + - Exercise: Compare when to use course notes versus outside references for clarification. + + MIT OCW links course notes and textbook-style r' + prerequisites: + - assessment-structure + mastery_signals: + - Explain how the course notes and textbook references supply the core conceptual + sequence. mastery_profile: {} -- id: attribution - title: Attribution - description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information - and Entropy'. +- id: independent-reasoning-and-careful-comparison + title: Independent Reasoning and Careful Comparison + description: '- Objective: Explain why the course requires precise comparison of + related but non-identical concepts. + + - Exercise: Write a short note distinguishing Shannon entropy, channel capacity, + and thermodynamic entropy. + + The syllabus framing implies' + prerequisites: + - course-notes-and-reference-texts + mastery_signals: + - Explain why the course requires precise comparison of related but non-identical + concepts. + mastery_profile: {} +- id: shannon + title: Shannon + description: Candidate concept extracted from lesson 'Independent Reasoning and + Careful Comparison'. prerequisites: [] + mastery_signals: + - Explain why the course requires precise comparison of related but non-identical + concepts. + mastery_profile: {} +- id: learners + title: Learners + description: Candidate concept extracted from lesson 'Independent Reasoning and + Careful Comparison'. + prerequisites: [] + mastery_signals: + - Explain why the course requires precise comparison of related but non-identical + concepts. + mastery_profile: {} +- id: mit-ocw-6-050j-information-and-entropy-unit-sequence + title: 'MIT OCW 6.050J Information and Entropy: Unit Sequence' + description: 'Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/ + + Attribution: adapted from the MIT OpenCourseWare unit progression and resource + organization for 6.050J Information and Entropy.' + prerequisites: + - independent-reasoning-and-careful-comparison mastery_signals: [] mastery_profile: {} - id: counting-and-probability @@ -59,284 +208,132 @@ concepts: - Exercise: Derive a simple counting argument for binary strings and compute an event probability. - This lesson i' + Early units e' prerequisites: - - mit-ocw-6-050j-information-and-entropy - mastery_signals: [] - mastery_profile: {} -- id: counting - title: Counting - description: Candidate concept extracted from lesson 'Counting and Probability'. 
- prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: probability - title: Probability - description: Candidate concept extracted from lesson 'Counting and Probability'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: objective - title: Objective - description: Candidate concept extracted from lesson 'Counting and Probability'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: explain - title: Explain - description: Candidate concept extracted from lesson 'Counting and Probability'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: exercise - title: Exercise - description: Candidate concept extracted from lesson 'Counting and Probability'. - prerequisites: [] - mastery_signals: [] + - mit-ocw-6-050j-information-and-entropy-unit-sequence + mastery_signals: + - Explain how counting arguments, probability spaces, and random variables support + later information-theory results. mastery_profile: {} - id: derive title: Derive description: Candidate concept extracted from lesson 'Counting and Probability'. prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: this - title: This - description: Candidate concept extracted from lesson 'Counting and Probability'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: random - title: Random - description: Candidate concept extracted from lesson 'Counting and Probability'. - prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain how counting arguments, probability spaces, and random variables support + later information-theory results. mastery_profile: {} - id: shannon-entropy title: Shannon Entropy - description: '- Objective: Explain Shannon Entropy as a measure of uncertainty and + description: '- Objective: Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources. - Exercise: Compute the entropy of a Bernoulli source and interpret the result. - This lesson centers Shannon Entropy, Surprise' + The course then introduces entropy as a quant' prerequisites: - counting-and-probability - mastery_signals: [] - mastery_profile: {} -- id: shannon - title: Shannon - description: Candidate concept extracted from lesson 'Shannon Entropy'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: compute - title: Compute - description: Candidate concept extracted from lesson 'Shannon Entropy'. - prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain Shannon entropy as a measure of uncertainty and compare high-entropy and + low-entropy sources. mastery_profile: {} - id: bernoulli title: Bernoulli description: Candidate concept extracted from lesson 'Shannon Entropy'. prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain Shannon entropy as a measure of uncertainty and compare high-entropy and + low-entropy sources. mastery_profile: {} - id: mutual-information title: Mutual Information - description: '- Objective: Explain Mutual Information and relate it to dependence - between signals. + description: '- Objective: Explain mutual information and relate it to dependence + between signals or observations. - Exercise: Compare independent variables with dependent variables using mutual-information reasoning. 
- This lesson introduces Mutual Information, Dependenc' + These units ask the learner to under' prerequisites: - shannon-entropy - mastery_signals: [] + mastery_signals: + - Explain mutual information and relate it to dependence between signals or observations. mastery_profile: {} -- id: mutual - title: Mutual - description: Candidate concept extracted from lesson 'Mutual Information'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: compare - title: Compare - description: Candidate concept extracted from lesson 'Mutual Information'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: dependence - title: Dependence - description: Candidate concept extracted from lesson 'Mutual Information'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: data-compression - title: Data Compression - description: '- Objective: Explain lossless compression in terms of entropy and - typical structure. +- id: source-coding-and-compression + title: Source Coding and Compression + description: '- Objective: Explain lossless compression in terms of entropy, redundancy, + and coding choices. - Exercise: Describe when compression succeeds and when it fails on already-random data. - This lesson covers Data Compression, Redundancy, and Efficient Rep' + The course develops the idea that structured sources can' prerequisites: - mutual-information - mastery_signals: [] - mastery_profile: {} -- id: data - title: Data - description: Candidate concept extracted from lesson 'Data Compression'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: compression - title: Compression - description: Candidate concept extracted from lesson 'Data Compression'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: describe - title: Describe - description: Candidate concept extracted from lesson 'Data Compression'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: redundancy - title: Redundancy - description: Candidate concept extracted from lesson 'Data Compression'. - prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain lossless compression in terms of entropy, redundancy, and coding choices. mastery_profile: {} - id: huffman-coding title: Huffman Coding - description: '- Objective: Explain Huffman Coding and justify why shorter codewords - should track more likely symbols. + description: '- Objective: Explain Huffman coding and justify why likely symbols + receive shorter descriptions. - Exercise: Build a Huffman code for a small source alphabet. - This lesson focuses on Huffman Coding, Prefix Codes, and Expected Length.' + Learners use trees and expected length arguments to connect probability models + to' prerequisites: - - data-compression - mastery_signals: [] - mastery_profile: {} -- id: huffman - title: Huffman - description: Candidate concept extracted from lesson 'Huffman Coding'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: coding - title: Coding - description: Candidate concept extracted from lesson 'Huffman Coding'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: build - title: Build - description: Candidate concept extracted from lesson 'Huffman Coding'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: prefix - title: Prefix - description: Candidate concept extracted from lesson 'Huffman Coding'. 
- prerequisites: [] - mastery_signals: [] + - source-coding-and-compression + mastery_signals: + - Explain Huffman coding and justify why likely symbols receive shorter descriptions. mastery_profile: {} - id: channel-capacity title: Channel Capacity - description: '- Objective: Explain Channel Capacity as a limit on reliable communication - over noisy channels. + description: '- Objective: Explain channel capacity as a limit on reliable communication + over a noisy channel. - Exercise: State why reliable transmission above capacity is impossible in the long run. - This lesson develops Channel Capacity, Reliable Commun' + The course treats capacity as a fundamental upper bou' prerequisites: - huffman-coding - mastery_signals: [] - mastery_profile: {} -- id: channel - title: Channel - description: Candidate concept extracted from lesson 'Channel Capacity'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: capacity - title: Capacity - description: Candidate concept extracted from lesson 'Channel Capacity'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: state - title: State - description: Candidate concept extracted from lesson 'Channel Capacity'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: reliable - title: Reliable - description: Candidate concept extracted from lesson 'Channel Capacity'. - prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain channel capacity as a limit on reliable communication over a noisy channel. mastery_profile: {} - id: channel-coding title: Channel Coding - description: '- Objective: Explain how Channel Coding adds structure that protects - messages against noise. + description: '- Objective: Explain how channel coding adds redundancy to protect + messages from noise. - Exercise: Contrast uncoded transmission with coded transmission on a noisy channel. - This lesson connects Channel Coding, Decoding, and Reliabilit' + These units emphasize that redundancy can be wasteful in compressi' prerequisites: - channel-capacity - mastery_signals: [] + mastery_signals: + - Explain how channel coding adds redundancy to protect messages from noise. mastery_profile: {} - id: contrast title: Contrast description: Candidate concept extracted from lesson 'Channel Coding'. prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: decoding - title: Decoding - description: Candidate concept extracted from lesson 'Channel Coding'. - prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain how channel coding adds redundancy to protect messages from noise. mastery_profile: {} - id: error-correcting-codes title: Error Correcting Codes - description: '- Objective: Explain how Error Correcting Codes detect or correct - symbol corruption. + description: '- Objective: Explain how error-correcting codes detect or repair corrupted + symbols. - Exercise: Describe a simple parity-style code and its limits. - This lesson covers Error Correcting Codes, Parity, and Syndrome-style reasoning. - The learne' + The learner must connect abstract limits to concrete coding mechanisms and understand + both s' prerequisites: - channel-coding - mastery_signals: [] - mastery_profile: {} -- id: error - title: Error - description: Candidate concept extracted from lesson 'Error Correcting Codes'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: correcting - title: Correcting - description: Candidate concept extracted from lesson 'Error Correcting Codes'. 
- prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: codes - title: Codes - description: Candidate concept extracted from lesson 'Error Correcting Codes'. - prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain how error-correcting codes detect or repair corrupted symbols. mastery_profile: {} - id: cryptography-and-information-hiding title: Cryptography and Information Hiding @@ -345,24 +342,11 @@ concepts: - Exercise: Compare a secure scheme with a weak one in terms of revealed information. - This lesson combines Cryptography, Information Leakag' + The course extends information-theoretic reasoning to' prerequisites: - error-correcting-codes - mastery_signals: [] - mastery_profile: {} -- id: cryptography - title: Cryptography - description: Candidate concept extracted from lesson 'Cryptography and Information - Hiding'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: hiding - title: Hiding - description: Candidate concept extracted from lesson 'Cryptography and Information - Hiding'. - prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain the relationship between secrecy, information leakage, and coded communication. mastery_profile: {} - id: thermodynamics-and-entropy title: Thermodynamics and Entropy @@ -372,49 +356,37 @@ concepts: - Exercise: Compare the two entropy notions and identify what is preserved across the analogy. - This lesson connects Thermodynamics, Entropy, and P' + The course uses entropy as a bridge concept between' prerequisites: - cryptography-and-information-hiding - mastery_signals: [] + mastery_signals: + - Explain how thermodynamic entropy relates to, and differs from, Shannon entropy. mastery_profile: {} -- id: thermodynamics - title: Thermodynamics - description: Candidate concept extracted from lesson 'Thermodynamics and Entropy'. - prerequisites: [] - mastery_signals: [] +- id: reversible-computation-and-quantum-computation + title: Reversible Computation and Quantum Computation + description: '- Objective: Explain why the physical implementation of computation + matters for information processing limits. + + - Exercise: Summarize how reversible computation changes the discussion of dissipation + and information loss. + + Later units connect' + prerequisites: + - thermodynamics-and-entropy + mastery_signals: + - Explain why the physical implementation of computation matters for information + processing limits. mastery_profile: {} - id: course-synthesis title: Course Synthesis description: '- Objective: Synthesize the course by connecting entropy, coding, - reliability, and physical interpretation in one coherent narrative. + reliability, secrecy, and physical interpretation in one coherent narrative. - Exercise: Produce a final study guide that links source coding, channel coding, - secrecy, and thermodynam' + secrecy, thermo' prerequisites: - - thermodynamics-and-entropy - mastery_signals: [] - mastery_profile: {} -- id: course - title: Course - description: Candidate concept extracted from lesson 'Course Synthesis'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: synthesis - title: Synthesis - description: Candidate concept extracted from lesson 'Course Synthesis'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: synthesize - title: Synthesize - description: Candidate concept extracted from lesson 'Course Synthesis'. 
- prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: produce - title: Produce - description: Candidate concept extracted from lesson 'Course Synthesis'. - prerequisites: [] - mastery_signals: [] + - reversible-computation-and-quantum-computation + mastery_signals: + - Synthesize the course by connecting entropy, coding, reliability, secrecy, and + physical interpretation in one coherent narrative. mastery_profile: {} diff --git a/domain-packs/mit-ocw-information-entropy/license_attribution.json b/domain-packs/mit-ocw-information-entropy/license_attribution.json index c2ffc0c..7b35fd3 100644 --- a/domain-packs/mit-ocw-information-entropy/license_attribution.json +++ b/domain-packs/mit-ocw-information-entropy/license_attribution.json @@ -2,9 +2,19 @@ "rights_note": "Derived from MIT OpenCourseWare 6.050J Information and Entropy (Spring 2008). Retain MIT OCW attribution and applicable Creative Commons terms before redistribution.", "sources": [ { - "source_path": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/6-050j-information-and-entropy.md", + "source_path": "examples/ocw-information-entropy/course/course-home.md", "source_type": "markdown", - "title": "6 050J Information And Entropy" + "title": "Course Home" + }, + { + "source_path": "examples/ocw-information-entropy/course/syllabus.md", + "source_type": "markdown", + "title": "Syllabus" + }, + { + "source_path": "examples/ocw-information-entropy/course/unit-sequence.md", + "source_type": "markdown", + "title": "Unit Sequence" } ] } \ No newline at end of file diff --git a/domain-packs/mit-ocw-information-entropy/pack.yaml b/domain-packs/mit-ocw-information-entropy/pack.yaml index fa010da..7317be6 100644 --- a/domain-packs/mit-ocw-information-entropy/pack.yaml +++ b/domain-packs/mit-ocw-information-entropy/pack.yaml @@ -12,3 +12,5 @@ dependencies: [] overrides: [] profile_templates: {} cross_pack_links: [] +supporting_artifacts: +- source_corpus.json diff --git a/domain-packs/mit-ocw-information-entropy/pack_compliance_manifest.json b/domain-packs/mit-ocw-information-entropy/pack_compliance_manifest.json index 6c71ee5..d64c7b1 100644 --- a/domain-packs/mit-ocw-information-entropy/pack_compliance_manifest.json +++ b/domain-packs/mit-ocw-information-entropy/pack_compliance_manifest.json @@ -3,6 +3,7 @@ "display_name": "MIT OCW Information and Entropy", "derived_from_sources": [ "mit-ocw-6-050j-course-home", + "mit-ocw-6-050j-syllabus", "mit-ocw-6-050j-unit-8-textbook", "mit-ocw-6-050j-unit-13-textbook" ], diff --git a/domain-packs/mit-ocw-information-entropy/review_report.md b/domain-packs/mit-ocw-information-entropy/review_report.md index fcc415c..989beb1 100644 --- a/domain-packs/mit-ocw-information-entropy/review_report.md +++ b/domain-packs/mit-ocw-information-entropy/review_report.md @@ -1,59 +1,5 @@ # Review Report -- Module 'Imported from MARKDOWN' has no explicit exercises; mastery signals may be weak. -- Concept 'MIT OCW 6.050J Information and Entropy' has no extracted mastery signals; review manually. -- Concept 'Information' has no extracted mastery signals; review manually. -- Concept 'Entropy' has no extracted mastery signals; review manually. -- Concept 'Source' has no extracted mastery signals; review manually. -- Concept 'OpenCourseWare' has no extracted mastery signals; review manually. -- Concept 'Spring' has no extracted mastery signals; review manually. -- Concept 'Attribution' has no extracted mastery signals; review manually. 
-- Concept 'Counting and Probability' has no extracted mastery signals; review manually. -- Concept 'Counting' has no extracted mastery signals; review manually. -- Concept 'Probability' has no extracted mastery signals; review manually. -- Concept 'Objective' has no extracted mastery signals; review manually. -- Concept 'Explain' has no extracted mastery signals; review manually. -- Concept 'Exercise' has no extracted mastery signals; review manually. -- Concept 'Derive' has no extracted mastery signals; review manually. -- Concept 'This' has no extracted mastery signals; review manually. -- Concept 'Random' has no extracted mastery signals; review manually. -- Concept 'Shannon Entropy' has no extracted mastery signals; review manually. -- Concept 'Shannon' has no extracted mastery signals; review manually. -- Concept 'Compute' has no extracted mastery signals; review manually. -- Concept 'Bernoulli' has no extracted mastery signals; review manually. -- Concept 'Mutual Information' has no extracted mastery signals; review manually. -- Concept 'Mutual' has no extracted mastery signals; review manually. -- Concept 'Compare' has no extracted mastery signals; review manually. -- Concept 'Dependence' has no extracted mastery signals; review manually. -- Concept 'Data Compression' has no extracted mastery signals; review manually. -- Concept 'Data' has no extracted mastery signals; review manually. -- Concept 'Compression' has no extracted mastery signals; review manually. -- Concept 'Describe' has no extracted mastery signals; review manually. -- Concept 'Redundancy' has no extracted mastery signals; review manually. -- Concept 'Huffman Coding' has no extracted mastery signals; review manually. -- Concept 'Huffman' has no extracted mastery signals; review manually. -- Concept 'Coding' has no extracted mastery signals; review manually. -- Concept 'Build' has no extracted mastery signals; review manually. -- Concept 'Prefix' has no extracted mastery signals; review manually. -- Concept 'Channel Capacity' has no extracted mastery signals; review manually. -- Concept 'Channel' has no extracted mastery signals; review manually. -- Concept 'Capacity' has no extracted mastery signals; review manually. -- Concept 'State' has no extracted mastery signals; review manually. -- Concept 'Reliable' has no extracted mastery signals; review manually. -- Concept 'Channel Coding' has no extracted mastery signals; review manually. -- Concept 'Contrast' has no extracted mastery signals; review manually. -- Concept 'Decoding' has no extracted mastery signals; review manually. -- Concept 'Error Correcting Codes' has no extracted mastery signals; review manually. -- Concept 'Error' has no extracted mastery signals; review manually. -- Concept 'Correcting' has no extracted mastery signals; review manually. -- Concept 'Codes' has no extracted mastery signals; review manually. -- Concept 'Cryptography and Information Hiding' has no extracted mastery signals; review manually. -- Concept 'Cryptography' has no extracted mastery signals; review manually. -- Concept 'Hiding' has no extracted mastery signals; review manually. -- Concept 'Thermodynamics and Entropy' has no extracted mastery signals; review manually. -- Concept 'Thermodynamics' has no extracted mastery signals; review manually. -- Concept 'Course Synthesis' has no extracted mastery signals; review manually. -- Concept 'Course' has no extracted mastery signals; review manually. -- Concept 'Synthesis' has no extracted mastery signals; review manually. 
-- Concept 'Synthesize' has no extracted mastery signals; review manually. -- Concept 'Produce' has no extracted mastery signals; review manually. \ No newline at end of file +- Concept 'MIT OCW 6.050J Information and Entropy: Course Home' has no extracted mastery signals; review manually. +- Concept 'MIT OCW 6.050J Information and Entropy: Syllabus' has no extracted mastery signals; review manually. +- Concept 'MIT OCW 6.050J Information and Entropy: Unit Sequence' has no extracted mastery signals; review manually. \ No newline at end of file diff --git a/domain-packs/mit-ocw-information-entropy/roadmap.yaml b/domain-packs/mit-ocw-information-entropy/roadmap.yaml index 862dccb..5c13c99 100644 --- a/domain-packs/mit-ocw-information-entropy/roadmap.yaml +++ b/domain-packs/mit-ocw-information-entropy/roadmap.yaml @@ -2,16 +2,50 @@ stages: - id: stage-1 title: Imported from MARKDOWN concepts: - - mit-ocw-6-050j-information-and-entropy + - mit-ocw-6-050j-information-and-entropy-course-home + - information-and-entropy + - ultimate-limits-to-communication-and-computation + - open-textbooks-problem-sets-and-programming-work + - mit-ocw-6-050j-information-and-entropy-syllabus + - prerequisites-and-mathematical-background + - assessment-structure + - course-notes-and-reference-texts + - independent-reasoning-and-careful-comparison + - mit-ocw-6-050j-information-and-entropy-unit-sequence - counting-and-probability - shannon-entropy - mutual-information - - data-compression + - source-coding-and-compression - huffman-coding - channel-capacity - channel-coding - error-correcting-codes - cryptography-and-information-hiding - thermodynamics-and-entropy + - reversible-computation-and-quantum-computation - course-synthesis - checkpoint: [] + checkpoint: + - Summarize the course in one paragraph for a prospective learner. + - List the main topic clusters that connect communication, computation, and entropy. + - Explain how these resource types support both conceptual study and practice. + - Decide whether a learner needs review in probability, linear algebra, or signals + before beginning. + - Build a study schedule that alternates reading, derivation, and worked exercises. + - Compare when to use course notes versus outside references for clarification. + - Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic + entropy. + - Derive a simple counting argument for binary strings and compute an event probability. + - Compute the entropy of a Bernoulli source and interpret the result. + - Compare independent variables with dependent variables using mutual-information + reasoning. + - Describe when compression succeeds and when it fails on already-random data. + - Build a Huffman code for a small source alphabet. + - State why reliable transmission above capacity is impossible in the long run. + - Contrast uncoded transmission with coded transmission on a noisy channel. + - Describe a simple parity-style code and its limits. + - Compare a secure scheme with a weak one in terms of revealed information. + - Compare the two entropy notions and identify what is preserved across the analogy. + - Summarize how reversible computation changes the discussion of dissipation and + information loss. + - Produce a final study guide that links source coding, channel coding, secrecy, + thermodynamic analogies, and computation. 
diff --git a/domain-packs/mit-ocw-information-entropy/source_corpus.json b/domain-packs/mit-ocw-information-entropy/source_corpus.json new file mode 100644 index 0000000..beb558a --- /dev/null +++ b/domain-packs/mit-ocw-information-entropy/source_corpus.json @@ -0,0 +1,803 @@ +{ + "course_title": "MIT OCW Information and Entropy", + "rights_note": "Derived from MIT OpenCourseWare 6.050J Information and Entropy (Spring 2008). Retain MIT OCW attribution and applicable Creative Commons terms before redistribution.", + "sources": [ + { + "source_path": "examples/ocw-information-entropy/course/course-home.md", + "source_type": "markdown", + "title": "Course Home", + "metadata": {} + }, + { + "source_path": "examples/ocw-information-entropy/course/syllabus.md", + "source_type": "markdown", + "title": "Syllabus", + "metadata": {} + }, + { + "source_path": "examples/ocw-information-entropy/course/unit-sequence.md", + "source_type": "markdown", + "title": "Unit Sequence", + "metadata": {} + } + ], + "fragments": [ + { + "fragment_id": "imported-from-markdown::mit-ocw-6.050j-information-and-entropy:-course-home::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "MIT OCW 6.050J Information and Entropy: Course Home", + "text": "Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/\nAttribution: adapted from the MIT OpenCourseWare course home page for 6.050J Information and Entropy.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ], + "objectives": [], + "exercises": [], + "key_terms": [ + "Information", + "Entropy" + ] + }, + { + "fragment_id": "imported-from-markdown::information-and-entropy::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Information and Entropy", + "text": "- Objective: Identify the course title, instructors, departments, level, and major topical areas.\n- Exercise: Summarize the course in one paragraph for a prospective learner.\nMIT OpenCourseWare presents 6.050J Information and Entropy as a Spring 2008 undergraduate subject taught by Paul Penfield and Seth Lloyd in Electrical Engineering and Computer Science together with Mechanical Engineering. The catalog framing emphasizes theory of computation, signal processing, and mathematical reasoning about information.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ], + "objectives": [ + "Identify the course title, instructors, departments, level, and major topical areas." + ], + "exercises": [ + "Summarize the course in one paragraph for a prospective learner." 
+ ], + "key_terms": [ + "Information", + "Entropy", + "Paul", + "Penfield", + "Seth", + "Lloyd", + "Electrical", + "Engineering" + ] + }, + { + "fragment_id": "imported-from-markdown::information-and-entropy::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Information and Entropy", + "text": "Identify the course title, instructors, departments, level, and major topical areas.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ] + }, + { + "fragment_id": "imported-from-markdown::information-and-entropy::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Information and Entropy", + "text": "Summarize the course in one paragraph for a prospective learner.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ] + }, + { + "fragment_id": "imported-from-markdown::ultimate-limits-to-communication-and-computation::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Ultimate Limits to Communication and Computation", + "text": "- Objective: Explain the broad intellectual scope of the course.\n- Exercise: List the main topic clusters that connect communication, computation, and entropy.\nThe course examines the ultimate limits to communication and computation with emphasis on the physical nature of information processing. The source description highlights information and computation, digital signals, codes and compression, noise, probability, error correction, reversible and irreversible operations, physics of computation, and quantum computation. Entropy is explicitly connected both to channel capacity and to the second law of thermodynamics.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ], + "objectives": [ + "Explain the broad intellectual scope of the course." + ], + "exercises": [ + "List the main topic clusters that connect communication, computation, and entropy." + ], + "key_terms": [ + "Entropy" + ] + }, + { + "fragment_id": "imported-from-markdown::ultimate-limits-to-communication-and-computation::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Ultimate Limits to Communication and Computation", + "text": "Explain the broad intellectual scope of the course.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ] + }, + { + "fragment_id": "imported-from-markdown::ultimate-limits-to-communication-and-computation::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Ultimate Limits to Communication and Computation", + "text": "List the main topic clusters that connect communication, computation, and entropy.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ] + }, + { + "fragment_id": "imported-from-markdown::open-textbooks,-problem-sets,-and-programming-work::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Open Textbooks, Problem Sets, and Programming Work", + "text": "- Objective: Identify the main kinds of learning resources supplied through the course.\n- Exercise: Explain how these resource types support both conceptual study and practice.\nThe course home lists open textbooks, problem sets, problem set solutions, and programming assignments. 
A learner using Didactopus should treat these as complementary evidence sources rather than relying on one summary alone.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ], + "objectives": [ + "Identify the main kinds of learning resources supplied through the course." + ], + "exercises": [ + "Explain how these resource types support both conceptual study and practice." + ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::open-textbooks,-problem-sets,-and-programming-work::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Open Textbooks, Problem Sets, and Programming Work", + "text": "Identify the main kinds of learning resources supplied through the course.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ] + }, + { + "fragment_id": "imported-from-markdown::open-textbooks,-problem-sets,-and-programming-work::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Open Textbooks, Problem Sets, and Programming Work", + "text": "Explain how these resource types support both conceptual study and practice.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ] + }, + { + "fragment_id": "imported-from-markdown::mit-ocw-6.050j-information-and-entropy:-syllabus::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "MIT OCW 6.050J Information and Entropy: Syllabus", + "text": "Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/\nAttribution: adapted from the MIT OpenCourseWare syllabus page for 6.050J Information and Entropy.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ], + "objectives": [], + "exercises": [], + "key_terms": [ + "Information", + "Entropy" + ] + }, + { + "fragment_id": "imported-from-markdown::prerequisites-and-mathematical-background::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Prerequisites and Mathematical Background", + "text": "- Objective: Explain the mathematical maturity expected by the course.\n- Exercise: Decide whether a learner needs review in probability, linear algebra, or signals before beginning.\nThe syllabus expects a foundation comparable to MIT subjects in calculus and linear algebra, together with comfort in probability, signals, and basic programming. Didactopus should therefore surface prerequisite review when those foundations appear weak.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ], + "objectives": [ + "Explain the mathematical maturity expected by the course." + ], + "exercises": [ + "Decide whether a learner needs review in probability, linear algebra, or signals before beginning." 
+ ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::prerequisites-and-mathematical-background::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Prerequisites and Mathematical Background", + "text": "Explain the mathematical maturity expected by the course.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::prerequisites-and-mathematical-background::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Prerequisites and Mathematical Background", + "text": "Decide whether a learner needs review in probability, linear algebra, or signals before beginning.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::assessment-structure::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Assessment Structure", + "text": "- Objective: Identify the role of problem sets, exams, and programming work in the course.\n- Exercise: Build a study schedule that alternates reading, derivation, and worked exercises.\nThe syllabus emphasizes regular problem solving and quantitative reasoning. The course is not only a reading list: learners are expected to derive results, solve structured problems, and connect abstract arguments to implementation-oriented tasks.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ], + "objectives": [ + "Identify the role of problem sets, exams, and programming work in the course." + ], + "exercises": [ + "Build a study schedule that alternates reading, derivation, and worked exercises." + ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::assessment-structure::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Assessment Structure", + "text": "Identify the role of problem sets, exams, and programming work in the course.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::assessment-structure::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Assessment Structure", + "text": "Build a study schedule that alternates reading, derivation, and worked exercises.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::course-notes-and-reference-texts::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Course Notes and Reference Texts", + "text": "- Objective: Explain how the course notes and textbook references supply the core conceptual sequence.\n- Exercise: Compare when to use course notes versus outside references for clarification.\nMIT OCW links course notes and textbook-style resources through the syllabus and resource pages. The intended use is cumulative: earlier notes establish counting, probability, and entropy, while later materials expand into coding, noise, secrecy, thermodynamics, and computation.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ], + "objectives": [ + "Explain how the course notes and textbook references supply the core conceptual sequence." + ], + "exercises": [ + "Compare when to use course notes versus outside references for clarification." 
+ ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::course-notes-and-reference-texts::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Course Notes and Reference Texts", + "text": "Explain how the course notes and textbook references supply the core conceptual sequence.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::course-notes-and-reference-texts::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Course Notes and Reference Texts", + "text": "Compare when to use course notes versus outside references for clarification.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::independent-reasoning-and-careful-comparison::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Independent Reasoning and Careful Comparison", + "text": "- Objective: Explain why the course requires precise comparison of related but non-identical concepts.\n- Exercise: Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic entropy.\nThe syllabus framing implies a style of work where analogy is useful but dangerous when used loosely. Learners must compare models carefully, state assumptions, and notice where similar mathematics does not imply identical interpretation.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ], + "objectives": [ + "Explain why the course requires precise comparison of related but non-identical concepts." + ], + "exercises": [ + "Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic entropy." 
+ ], + "key_terms": [ + "Shannon", + "Learners" + ] + }, + { + "fragment_id": "imported-from-markdown::independent-reasoning-and-careful-comparison::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Independent Reasoning and Careful Comparison", + "text": "Explain why the course requires precise comparison of related but non-identical concepts.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::independent-reasoning-and-careful-comparison::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Independent Reasoning and Careful Comparison", + "text": "Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic entropy.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::mit-ocw-6.050j-information-and-entropy:-unit-sequence::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "MIT OCW 6.050J Information and Entropy: Unit Sequence", + "text": "Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/\nAttribution: adapted from the MIT OpenCourseWare unit progression and resource organization for 6.050J Information and Entropy.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [], + "exercises": [], + "key_terms": [ + "Information", + "Entropy" + ] + }, + { + "fragment_id": "imported-from-markdown::counting-and-probability::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Counting and Probability", + "text": "- Objective: Explain how counting arguments, probability spaces, and random variables support later information-theory results.\n- Exercise: Derive a simple counting argument for binary strings and compute an event probability.\nEarly units establish counting, combinatorics, and probability as the language used to reason about uncertainty, messages, and evidence.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain how counting arguments, probability spaces, and random variables support later information-theory results." + ], + "exercises": [ + "Derive a simple counting argument for binary strings and compute an event probability." 
+ ], + "key_terms": [ + "Derive" + ] + }, + { + "fragment_id": "imported-from-markdown::counting-and-probability::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Counting and Probability", + "text": "Explain how counting arguments, probability spaces, and random variables support later information-theory results.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::counting-and-probability::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Counting and Probability", + "text": "Derive a simple counting argument for binary strings and compute an event probability.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::shannon-entropy::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Shannon Entropy", + "text": "- Objective: Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources.\n- Exercise: Compute the entropy of a Bernoulli source and interpret the result.\nThe course then introduces entropy as a quantitative measure of uncertainty for a source model and uses it to reason about representation cost and surprise.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources." + ], + "exercises": [ + "Compute the entropy of a Bernoulli source and interpret the result." + ], + "key_terms": [ + "Shannon", + "Bernoulli" + ] + }, + { + "fragment_id": "imported-from-markdown::shannon-entropy::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Shannon Entropy", + "text": "Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::shannon-entropy::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Shannon Entropy", + "text": "Compute the entropy of a Bernoulli source and interpret the result.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::mutual-information::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Mutual Information", + "text": "- Objective: Explain mutual information and relate it to dependence between signals or observations.\n- Exercise: Compare independent variables with dependent variables using mutual-information reasoning.\nThese units ask the learner to understand how observation changes uncertainty and what it means for one variable to carry information about another.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain mutual information and relate it to dependence between signals or observations." + ], + "exercises": [ + "Compare independent variables with dependent variables using mutual-information reasoning." 
+ ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::mutual-information::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Mutual Information", + "text": "Explain mutual information and relate it to dependence between signals or observations.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::mutual-information::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Mutual Information", + "text": "Compare independent variables with dependent variables using mutual-information reasoning.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::source-coding-and-compression::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Source Coding and Compression", + "text": "- Objective: Explain lossless compression in terms of entropy, redundancy, and coding choices.\n- Exercise: Describe when compression succeeds and when it fails on already-random data.\nThe course develops the idea that structured sources can often be described more efficiently, but only up to limits implied by entropy.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain lossless compression in terms of entropy, redundancy, and coding choices." + ], + "exercises": [ + "Describe when compression succeeds and when it fails on already-random data." + ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::source-coding-and-compression::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Source Coding and Compression", + "text": "Explain lossless compression in terms of entropy, redundancy, and coding choices.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::source-coding-and-compression::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Source Coding and Compression", + "text": "Describe when compression succeeds and when it fails on already-random data.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::huffman-coding::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Huffman Coding", + "text": "- Objective: Explain Huffman coding and justify why likely symbols receive shorter descriptions.\n- Exercise: Build a Huffman code for a small source alphabet.\nLearners use trees and expected length arguments to connect probability models to practical code design.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain Huffman coding and justify why likely symbols receive shorter descriptions." + ], + "exercises": [ + "Build a Huffman code for a small source alphabet." 
+ ], + "key_terms": [ + "Huffman", + "Learners" + ] + }, + { + "fragment_id": "imported-from-markdown::huffman-coding::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Huffman Coding", + "text": "Explain Huffman coding and justify why likely symbols receive shorter descriptions.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::huffman-coding::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Huffman Coding", + "text": "Build a Huffman code for a small source alphabet.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::channel-capacity::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Channel Capacity", + "text": "- Objective: Explain channel capacity as a limit on reliable communication over a noisy channel.\n- Exercise: State why reliable transmission above capacity is impossible in the long run.\nThe course treats capacity as a fundamental upper bound and frames noisy communication in terms of rates, inference, and uncertainty reduction.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain channel capacity as a limit on reliable communication over a noisy channel." + ], + "exercises": [ + "State why reliable transmission above capacity is impossible in the long run." + ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::channel-capacity::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Channel Capacity", + "text": "Explain channel capacity as a limit on reliable communication over a noisy channel.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::channel-capacity::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Channel Capacity", + "text": "State why reliable transmission above capacity is impossible in the long run.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::channel-coding::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Channel Coding", + "text": "- Objective: Explain how channel coding adds redundancy to protect messages from noise.\n- Exercise: Contrast uncoded transmission with coded transmission on a noisy channel.\nThese units emphasize that redundancy can be wasteful in compression but essential in communication under uncertainty.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain how channel coding adds redundancy to protect messages from noise." + ], + "exercises": [ + "Contrast uncoded transmission with coded transmission on a noisy channel." 
+ ], + "key_terms": [ + "Contrast" + ] + }, + { + "fragment_id": "imported-from-markdown::channel-coding::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Channel Coding", + "text": "Explain how channel coding adds redundancy to protect messages from noise.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::channel-coding::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Channel Coding", + "text": "Contrast uncoded transmission with coded transmission on a noisy channel.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::error-correcting-codes::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Error Correcting Codes", + "text": "- Objective: Explain how error-correcting codes detect or repair corrupted symbols.\n- Exercise: Describe a simple parity-style code and its limits.\nThe learner must connect abstract limits to concrete coding mechanisms and understand both strengths and failure modes.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain how error-correcting codes detect or repair corrupted symbols." + ], + "exercises": [ + "Describe a simple parity-style code and its limits." + ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::error-correcting-codes::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Error Correcting Codes", + "text": "Explain how error-correcting codes detect or repair corrupted symbols.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::error-correcting-codes::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Error Correcting Codes", + "text": "Describe a simple parity-style code and its limits.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::cryptography-and-information-hiding::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Cryptography and Information Hiding", + "text": "- Objective: Explain the relationship between secrecy, information leakage, and coded communication.\n- Exercise: Compare a secure scheme with a weak one in terms of revealed information.\nThe course extends information-theoretic reasoning to adversarial settings where controlling what an observer can infer becomes central.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain the relationship between secrecy, information leakage, and coded communication." + ], + "exercises": [ + "Compare a secure scheme with a weak one in terms of revealed information." 
+ ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::cryptography-and-information-hiding::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Cryptography and Information Hiding", + "text": "Explain the relationship between secrecy, information leakage, and coded communication.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::cryptography-and-information-hiding::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Cryptography and Information Hiding", + "text": "Compare a secure scheme with a weak one in terms of revealed information.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::thermodynamics-and-entropy::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Thermodynamics and Entropy", + "text": "- Objective: Explain how thermodynamic entropy relates to, and differs from, Shannon entropy.\n- Exercise: Compare the two entropy notions and identify what is preserved across the analogy.\nThe course uses entropy as a bridge concept between communication theory and physics while insisting on careful interpretation.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain how thermodynamic entropy relates to, and differs from, Shannon entropy." + ], + "exercises": [ + "Compare the two entropy notions and identify what is preserved across the analogy." + ], + "key_terms": [ + "Shannon" + ] + }, + { + "fragment_id": "imported-from-markdown::thermodynamics-and-entropy::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Thermodynamics and Entropy", + "text": "Explain how thermodynamic entropy relates to, and differs from, Shannon entropy.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::thermodynamics-and-entropy::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Thermodynamics and Entropy", + "text": "Compare the two entropy notions and identify what is preserved across the analogy.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::reversible-computation-and-quantum-computation::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Reversible Computation and Quantum Computation", + "text": "- Objective: Explain why the physical implementation of computation matters for information processing limits.\n- Exercise: Summarize how reversible computation changes the discussion of dissipation and information loss.\nLater units connect information, entropy, and computation more directly by considering reversible logic, irreversibility, and quantum information themes.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain why the physical implementation of computation matters for information processing limits." + ], + "exercises": [ + "Summarize how reversible computation changes the discussion of dissipation and information loss." 
+ ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::reversible-computation-and-quantum-computation::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Reversible Computation and Quantum Computation", + "text": "Explain why the physical implementation of computation matters for information processing limits.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::reversible-computation-and-quantum-computation::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Reversible Computation and Quantum Computation", + "text": "Summarize how reversible computation changes the discussion of dissipation and information loss.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::course-synthesis::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Course Synthesis", + "text": "- Objective: Synthesize the course by connecting entropy, coding, reliability, secrecy, and physical interpretation in one coherent narrative.\n- Exercise: Produce a final study guide that links source coding, channel coding, secrecy, thermodynamic analogies, and computation.\nThe end of the course asks the learner to unify the mathematical and physical perspectives rather than treating the units as disconnected topics.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Synthesize the course by connecting entropy, coding, reliability, secrecy, and physical interpretation in one coherent narrative." + ], + "exercises": [ + "Produce a final study guide that links source coding, channel coding, secrecy, thermodynamic analogies, and computation." + ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::course-synthesis::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Course Synthesis", + "text": "Synthesize the course by connecting entropy, coding, reliability, secrecy, and physical interpretation in one coherent narrative.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::course-synthesis::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Course Synthesis", + "text": "Produce a final study guide that links source coding, channel coding, secrecy, thermodynamic analogies, and computation.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + } + ] +} \ No newline at end of file diff --git a/domain-packs/mit-ocw-information-entropy/source_inventory.yaml b/domain-packs/mit-ocw-information-entropy/source_inventory.yaml index 8423411..adaa0e4 100644 --- a/domain-packs/mit-ocw-information-entropy/source_inventory.yaml +++ b/domain-packs/mit-ocw-information-entropy/source_inventory.yaml @@ -6,7 +6,20 @@ sources: creator: MIT OpenCourseWare license_id: CC BY-NC-SA 4.0 license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/ - retrieved_at: "2026-03-14" + retrieved_at: "2026-03-16" + adapted: true + attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0. 
+ excluded_from_upstream_license: false + exclusion_notes: "" + + - source_id: mit-ocw-6-050j-syllabus + title: MIT OpenCourseWare 6.050J Information and Entropy syllabus + url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/ + publisher: Massachusetts Institute of Technology + creator: MIT OpenCourseWare + license_id: CC BY-NC-SA 4.0 + license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/ + retrieved_at: "2026-03-16" adapted: true attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0. excluded_from_upstream_license: false @@ -19,7 +32,7 @@ sources: creator: MIT OpenCourseWare license_id: CC BY-NC-SA 4.0 license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/ - retrieved_at: "2026-03-14" + retrieved_at: "2026-03-16" adapted: true attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0. excluded_from_upstream_license: false @@ -32,7 +45,7 @@ sources: creator: MIT OpenCourseWare license_id: CC BY-NC-SA 4.0 license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/ - retrieved_at: "2026-03-14" + retrieved_at: "2026-03-16" adapted: true attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0. excluded_from_upstream_license: false diff --git a/examples/ocw-information-entropy-run/artifact_manifest.json b/examples/ocw-information-entropy-run/artifact_manifest.json index 07fe9c9..dda9aab 100644 --- a/examples/ocw-information-entropy-run/artifact_manifest.json +++ b/examples/ocw-information-entropy-run/artifact_manifest.json @@ -3,9 +3,54 @@ "domain": "MIT OCW Information and Entropy", "artifacts": [ { - "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy", + "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home", "artifact_type": "symbolic", - "artifact_name": "mit-ocw-6-050j-information-and-entropy.md" + "artifact_name": "mit-ocw-6-050j-information-and-entropy-course-home.md" + }, + { + "concept": "mit-ocw-information-and-entropy::information-and-entropy", + "artifact_type": "symbolic", + "artifact_name": "information-and-entropy.md" + }, + { + "concept": "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation", + "artifact_type": "symbolic", + "artifact_name": "ultimate-limits-to-communication-and-computation.md" + }, + { + "concept": "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work", + "artifact_type": "symbolic", + "artifact_name": "open-textbooks-problem-sets-and-programming-work.md" + }, + { + "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus", + "artifact_type": "symbolic", + "artifact_name": "mit-ocw-6-050j-information-and-entropy-syllabus.md" + }, + { + "concept": "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background", + "artifact_type": "symbolic", + "artifact_name": "prerequisites-and-mathematical-background.md" + }, + { + "concept": "mit-ocw-information-and-entropy::assessment-structure", + "artifact_type": "symbolic", + "artifact_name": "assessment-structure.md" + }, + { + "concept": "mit-ocw-information-and-entropy::course-notes-and-reference-texts", + "artifact_type": "symbolic", + "artifact_name": "course-notes-and-reference-texts.md" + }, + { + "concept": 
"mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison", + "artifact_type": "symbolic", + "artifact_name": "independent-reasoning-and-careful-comparison.md" + }, + { + "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence", + "artifact_type": "symbolic", + "artifact_name": "mit-ocw-6-050j-information-and-entropy-unit-sequence.md" }, { "concept": "mit-ocw-information-and-entropy::counting-and-probability", @@ -23,9 +68,9 @@ "artifact_name": "mutual-information.md" }, { - "concept": "mit-ocw-information-and-entropy::data-compression", + "concept": "mit-ocw-information-and-entropy::source-coding-and-compression", "artifact_type": "symbolic", - "artifact_name": "data-compression.md" + "artifact_name": "source-coding-and-compression.md" }, { "concept": "mit-ocw-information-and-entropy::huffman-coding", diff --git a/examples/ocw-information-entropy-run/capability_profile.json b/examples/ocw-information-entropy-run/capability_profile.json index 1bf670d..461588e 100644 --- a/examples/ocw-information-entropy-run/capability_profile.json +++ b/examples/ocw-information-entropy-run/capability_profile.json @@ -3,24 +3,42 @@ "display_name": "OCW Information Entropy Agent", "domain": "MIT OCW Information and Entropy", "mastered_concepts": [ + "mit-ocw-information-and-entropy::assessment-structure", "mit-ocw-information-and-entropy::channel-capacity", "mit-ocw-information-and-entropy::channel-coding", "mit-ocw-information-and-entropy::counting-and-probability", + "mit-ocw-information-and-entropy::course-notes-and-reference-texts", "mit-ocw-information-and-entropy::cryptography-and-information-hiding", - "mit-ocw-information-and-entropy::data-compression", "mit-ocw-information-and-entropy::error-correcting-codes", "mit-ocw-information-and-entropy::huffman-coding", - "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy", + "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison", + "mit-ocw-information-and-entropy::information-and-entropy", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence", "mit-ocw-information-and-entropy::mutual-information", + "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work", + "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background", "mit-ocw-information-and-entropy::shannon-entropy", - "mit-ocw-information-and-entropy::thermodynamics-and-entropy" + "mit-ocw-information-and-entropy::source-coding-and-compression", + "mit-ocw-information-and-entropy::thermodynamics-and-entropy", + "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation" ], "weak_dimensions_by_concept": { - "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy": [], + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home": [], + "mit-ocw-information-and-entropy::information-and-entropy": [], + "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation": [], + "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work": [], + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus": [], + "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background": [], + 
"mit-ocw-information-and-entropy::assessment-structure": [], + "mit-ocw-information-and-entropy::course-notes-and-reference-texts": [], + "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison": [], + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence": [], "mit-ocw-information-and-entropy::counting-and-probability": [], "mit-ocw-information-and-entropy::shannon-entropy": [], "mit-ocw-information-and-entropy::mutual-information": [], - "mit-ocw-information-and-entropy::data-compression": [], + "mit-ocw-information-and-entropy::source-coding-and-compression": [], "mit-ocw-information-and-entropy::huffman-coding": [], "mit-ocw-information-and-entropy::channel-capacity": [], "mit-ocw-information-and-entropy::channel-coding": [], @@ -29,7 +47,52 @@ "mit-ocw-information-and-entropy::thermodynamics-and-entropy": [] }, "evaluator_summary_by_concept": { - "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy": { + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::information-and-entropy": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::assessment-structure": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::course-notes-and-reference-texts": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence": { "correctness": 0.8400000000000001, "explanation": 0.85, "critique": 0.7999999999999999 @@ -49,7 +112,7 @@ "explanation": 0.85, "critique": 0.7999999999999999 }, - "mit-ocw-information-and-entropy::data-compression": { + "mit-ocw-information-and-entropy::source-coding-and-compression": { "correctness": 0.8400000000000001, "explanation": 0.85, "critique": 0.7999999999999999 @@ -87,9 +150,54 @@ }, "artifacts": [ { - "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy", + "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home", "artifact_type": "symbolic", - "artifact_name": "mit-ocw-6-050j-information-and-entropy.md" + "artifact_name": "mit-ocw-6-050j-information-and-entropy-course-home.md" + }, + { + "concept": "mit-ocw-information-and-entropy::information-and-entropy", + 
"artifact_type": "symbolic", + "artifact_name": "information-and-entropy.md" + }, + { + "concept": "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation", + "artifact_type": "symbolic", + "artifact_name": "ultimate-limits-to-communication-and-computation.md" + }, + { + "concept": "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work", + "artifact_type": "symbolic", + "artifact_name": "open-textbooks-problem-sets-and-programming-work.md" + }, + { + "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus", + "artifact_type": "symbolic", + "artifact_name": "mit-ocw-6-050j-information-and-entropy-syllabus.md" + }, + { + "concept": "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background", + "artifact_type": "symbolic", + "artifact_name": "prerequisites-and-mathematical-background.md" + }, + { + "concept": "mit-ocw-information-and-entropy::assessment-structure", + "artifact_type": "symbolic", + "artifact_name": "assessment-structure.md" + }, + { + "concept": "mit-ocw-information-and-entropy::course-notes-and-reference-texts", + "artifact_type": "symbolic", + "artifact_name": "course-notes-and-reference-texts.md" + }, + { + "concept": "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison", + "artifact_type": "symbolic", + "artifact_name": "independent-reasoning-and-careful-comparison.md" + }, + { + "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence", + "artifact_type": "symbolic", + "artifact_name": "mit-ocw-6-050j-information-and-entropy-unit-sequence.md" }, { "concept": "mit-ocw-information-and-entropy::counting-and-probability", @@ -107,9 +215,9 @@ "artifact_name": "mutual-information.md" }, { - "concept": "mit-ocw-information-and-entropy::data-compression", + "concept": "mit-ocw-information-and-entropy::source-coding-and-compression", "artifact_type": "symbolic", - "artifact_name": "data-compression.md" + "artifact_name": "source-coding-and-compression.md" }, { "concept": "mit-ocw-information-and-entropy::huffman-coding", diff --git a/examples/ocw-information-entropy-run/capability_report.md b/examples/ocw-information-entropy-run/capability_report.md index 9c65e6a..ee56f97 100644 --- a/examples/ocw-information-entropy-run/capability_report.md +++ b/examples/ocw-information-entropy-run/capability_report.md @@ -4,19 +4,34 @@ - Domain: `MIT OCW Information and Entropy` ## Mastered Concepts +- mit-ocw-information-and-entropy::assessment-structure - mit-ocw-information-and-entropy::channel-capacity - mit-ocw-information-and-entropy::channel-coding - mit-ocw-information-and-entropy::counting-and-probability +- mit-ocw-information-and-entropy::course-notes-and-reference-texts - mit-ocw-information-and-entropy::cryptography-and-information-hiding -- mit-ocw-information-and-entropy::data-compression - mit-ocw-information-and-entropy::error-correcting-codes - mit-ocw-information-and-entropy::huffman-coding -- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy +- mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison +- mit-ocw-information-and-entropy::information-and-entropy +- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home +- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus +- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence - 
mit-ocw-information-and-entropy::mutual-information +- mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work +- mit-ocw-information-and-entropy::prerequisites-and-mathematical-background - mit-ocw-information-and-entropy::shannon-entropy +- mit-ocw-information-and-entropy::source-coding-and-compression - mit-ocw-information-and-entropy::thermodynamics-and-entropy +- mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation ## Concept Summaries +### mit-ocw-information-and-entropy::assessment-structure +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + ### mit-ocw-information-and-entropy::channel-capacity - correctness: 0.84 - critique: 0.80 @@ -35,13 +50,13 @@ - explanation: 0.85 - weak dimensions: none -### mit-ocw-information-and-entropy::cryptography-and-information-hiding +### mit-ocw-information-and-entropy::course-notes-and-reference-texts - correctness: 0.84 - critique: 0.80 - explanation: 0.85 - weak dimensions: none -### mit-ocw-information-and-entropy::data-compression +### mit-ocw-information-and-entropy::cryptography-and-information-hiding - correctness: 0.84 - critique: 0.80 - explanation: 0.85 @@ -59,7 +74,31 @@ - explanation: 0.85 - weak dimensions: none -### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy +### mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::information-and-entropy +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence - correctness: 0.84 - critique: 0.80 - explanation: 0.85 @@ -71,24 +110,57 @@ - explanation: 0.85 - weak dimensions: none +### mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::prerequisites-and-mathematical-background +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + ### mit-ocw-information-and-entropy::shannon-entropy - correctness: 0.84 - critique: 0.80 - explanation: 0.85 - weak dimensions: none +### mit-ocw-information-and-entropy::source-coding-and-compression +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + ### mit-ocw-information-and-entropy::thermodynamics-and-entropy - correctness: 0.84 - critique: 0.80 - explanation: 0.85 - weak dimensions: none +### mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + ## Artifacts -- mit-ocw-6-050j-information-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy +- mit-ocw-6-050j-information-and-entropy-course-home.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home +- information-and-entropy.md (symbolic) for 
mit-ocw-information-and-entropy::information-and-entropy +- ultimate-limits-to-communication-and-computation.md (symbolic) for mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation +- open-textbooks-problem-sets-and-programming-work.md (symbolic) for mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work +- mit-ocw-6-050j-information-and-entropy-syllabus.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus +- prerequisites-and-mathematical-background.md (symbolic) for mit-ocw-information-and-entropy::prerequisites-and-mathematical-background +- assessment-structure.md (symbolic) for mit-ocw-information-and-entropy::assessment-structure +- course-notes-and-reference-texts.md (symbolic) for mit-ocw-information-and-entropy::course-notes-and-reference-texts +- independent-reasoning-and-careful-comparison.md (symbolic) for mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison +- mit-ocw-6-050j-information-and-entropy-unit-sequence.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence - counting-and-probability.md (symbolic) for mit-ocw-information-and-entropy::counting-and-probability - shannon-entropy.md (symbolic) for mit-ocw-information-and-entropy::shannon-entropy - mutual-information.md (symbolic) for mit-ocw-information-and-entropy::mutual-information -- data-compression.md (symbolic) for mit-ocw-information-and-entropy::data-compression +- source-coding-and-compression.md (symbolic) for mit-ocw-information-and-entropy::source-coding-and-compression - huffman-coding.md (symbolic) for mit-ocw-information-and-entropy::huffman-coding - channel-capacity.md (symbolic) for mit-ocw-information-and-entropy::channel-capacity - channel-coding.md (symbolic) for mit-ocw-information-and-entropy::channel-coding diff --git a/examples/ocw-information-entropy-run/run_summary.json b/examples/ocw-information-entropy-run/run_summary.json index 0e67d87..a5d9609 100644 --- a/examples/ocw-information-entropy-run/run_summary.json +++ b/examples/ocw-information-entropy-run/run_summary.json @@ -1,75 +1,32 @@ { - "course_source": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/6-050j-information-and-entropy.md", - "pack_dir": "/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy", - "skill_dir": "/home/netuser/dev/Didactopustry1/skills/ocw-information-entropy-agent", - "source_inventory": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/sources.yaml", + "course_source": "examples/ocw-information-entropy/course", + "source_document_count": 3, + "pack_dir": "domain-packs/mit-ocw-information-entropy", + "skill_dir": "skills/ocw-information-entropy-agent", + "source_inventory": "examples/ocw-information-entropy/sources.yaml", "review_flags": [ - "Module 'Imported from MARKDOWN' has no explicit exercises; mastery signals may be weak.", - "Concept 'MIT OCW 6.050J Information and Entropy' has no extracted mastery signals; review manually.", - "Concept 'Information' has no extracted mastery signals; review manually.", - "Concept 'Entropy' has no extracted mastery signals; review manually.", - "Concept 'Source' has no extracted mastery signals; review manually.", - "Concept 'OpenCourseWare' has no extracted mastery signals; review manually.", - "Concept 'Spring' has no extracted mastery signals; review manually.", - "Concept 'Attribution' has no extracted mastery signals; review manually.", - 
"Concept 'Counting and Probability' has no extracted mastery signals; review manually.", - "Concept 'Counting' has no extracted mastery signals; review manually.", - "Concept 'Probability' has no extracted mastery signals; review manually.", - "Concept 'Objective' has no extracted mastery signals; review manually.", - "Concept 'Explain' has no extracted mastery signals; review manually.", - "Concept 'Exercise' has no extracted mastery signals; review manually.", - "Concept 'Derive' has no extracted mastery signals; review manually.", - "Concept 'This' has no extracted mastery signals; review manually.", - "Concept 'Random' has no extracted mastery signals; review manually.", - "Concept 'Shannon Entropy' has no extracted mastery signals; review manually.", - "Concept 'Shannon' has no extracted mastery signals; review manually.", - "Concept 'Compute' has no extracted mastery signals; review manually.", - "Concept 'Bernoulli' has no extracted mastery signals; review manually.", - "Concept 'Mutual Information' has no extracted mastery signals; review manually.", - "Concept 'Mutual' has no extracted mastery signals; review manually.", - "Concept 'Compare' has no extracted mastery signals; review manually.", - "Concept 'Dependence' has no extracted mastery signals; review manually.", - "Concept 'Data Compression' has no extracted mastery signals; review manually.", - "Concept 'Data' has no extracted mastery signals; review manually.", - "Concept 'Compression' has no extracted mastery signals; review manually.", - "Concept 'Describe' has no extracted mastery signals; review manually.", - "Concept 'Redundancy' has no extracted mastery signals; review manually.", - "Concept 'Huffman Coding' has no extracted mastery signals; review manually.", - "Concept 'Huffman' has no extracted mastery signals; review manually.", - "Concept 'Coding' has no extracted mastery signals; review manually.", - "Concept 'Build' has no extracted mastery signals; review manually.", - "Concept 'Prefix' has no extracted mastery signals; review manually.", - "Concept 'Channel Capacity' has no extracted mastery signals; review manually.", - "Concept 'Channel' has no extracted mastery signals; review manually.", - "Concept 'Capacity' has no extracted mastery signals; review manually.", - "Concept 'State' has no extracted mastery signals; review manually.", - "Concept 'Reliable' has no extracted mastery signals; review manually.", - "Concept 'Channel Coding' has no extracted mastery signals; review manually.", - "Concept 'Contrast' has no extracted mastery signals; review manually.", - "Concept 'Decoding' has no extracted mastery signals; review manually.", - "Concept 'Error Correcting Codes' has no extracted mastery signals; review manually.", - "Concept 'Error' has no extracted mastery signals; review manually.", - "Concept 'Correcting' has no extracted mastery signals; review manually.", - "Concept 'Codes' has no extracted mastery signals; review manually.", - "Concept 'Cryptography and Information Hiding' has no extracted mastery signals; review manually.", - "Concept 'Cryptography' has no extracted mastery signals; review manually.", - "Concept 'Hiding' has no extracted mastery signals; review manually.", - "Concept 'Thermodynamics and Entropy' has no extracted mastery signals; review manually.", - "Concept 'Thermodynamics' has no extracted mastery signals; review manually.", - "Concept 'Course Synthesis' has no extracted mastery signals; review manually.", - "Concept 'Course' has no extracted mastery signals; review 
manually.", - "Concept 'Synthesis' has no extracted mastery signals; review manually.", - "Concept 'Synthesize' has no extracted mastery signals; review manually.", - "Concept 'Produce' has no extracted mastery signals; review manually." + "Concept 'MIT OCW 6.050J Information and Entropy: Course Home' has no extracted mastery signals; review manually.", + "Concept 'MIT OCW 6.050J Information and Entropy: Syllabus' has no extracted mastery signals; review manually.", + "Concept 'MIT OCW 6.050J Information and Entropy: Unit Sequence' has no extracted mastery signals; review manually." ], - "concept_count": 56, + "concept_count": 34, + "source_fragment_count": 60, "target_concept": "mit-ocw-information-and-entropy::thermodynamics-and-entropy", "curriculum_path": [ - "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home", + "mit-ocw-information-and-entropy::information-and-entropy", + "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation", + "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus", + "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background", + "mit-ocw-information-and-entropy::assessment-structure", + "mit-ocw-information-and-entropy::course-notes-and-reference-texts", + "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence", "mit-ocw-information-and-entropy::counting-and-probability", "mit-ocw-information-and-entropy::shannon-entropy", "mit-ocw-information-and-entropy::mutual-information", - "mit-ocw-information-and-entropy::data-compression", + "mit-ocw-information-and-entropy::source-coding-and-compression", "mit-ocw-information-and-entropy::huffman-coding", "mit-ocw-information-and-entropy::channel-capacity", "mit-ocw-information-and-entropy::channel-coding", @@ -78,25 +35,35 @@ "mit-ocw-information-and-entropy::thermodynamics-and-entropy" ], "mastered_concepts": [ + "mit-ocw-information-and-entropy::assessment-structure", "mit-ocw-information-and-entropy::channel-capacity", "mit-ocw-information-and-entropy::channel-coding", "mit-ocw-information-and-entropy::counting-and-probability", + "mit-ocw-information-and-entropy::course-notes-and-reference-texts", "mit-ocw-information-and-entropy::cryptography-and-information-hiding", - "mit-ocw-information-and-entropy::data-compression", "mit-ocw-information-and-entropy::error-correcting-codes", "mit-ocw-information-and-entropy::huffman-coding", - "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy", + "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison", + "mit-ocw-information-and-entropy::information-and-entropy", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence", "mit-ocw-information-and-entropy::mutual-information", + "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work", + "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background", "mit-ocw-information-and-entropy::shannon-entropy", - 
"mit-ocw-information-and-entropy::thermodynamics-and-entropy" + "mit-ocw-information-and-entropy::source-coding-and-compression", + "mit-ocw-information-and-entropy::thermodynamics-and-entropy", + "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation" ], - "artifact_count": 11, - "compliance_manifest": "/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy/pack_compliance_manifest.json", + "artifact_count": 20, + "compliance_manifest": "domain-packs/mit-ocw-information-entropy/pack_compliance_manifest.json", "compliance": { "pack_id": "mit-ocw-information-and-entropy", "display_name": "MIT OCW Information and Entropy", "derived_from_sources": [ "mit-ocw-6-050j-course-home", + "mit-ocw-6-050j-syllabus", "mit-ocw-6-050j-unit-8-textbook", "mit-ocw-6-050j-unit-13-textbook" ], diff --git a/examples/ocw-information-entropy/course/course-home.md b/examples/ocw-information-entropy/course/course-home.md new file mode 100644 index 0000000..3ebbdb8 --- /dev/null +++ b/examples/ocw-information-entropy/course/course-home.md @@ -0,0 +1,25 @@ +# MIT OCW 6.050J Information and Entropy: Course Home + +Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/ +Attribution: adapted from the MIT OpenCourseWare course home page for 6.050J Information and Entropy. + +## Course Identity + +### Information and Entropy +- Objective: Identify the course title, instructors, departments, level, and major topical areas. +- Exercise: Summarize the course in one paragraph for a prospective learner. +MIT OpenCourseWare presents 6.050J Information and Entropy as a Spring 2008 undergraduate subject taught by Paul Penfield and Seth Lloyd in Electrical Engineering and Computer Science together with Mechanical Engineering. The catalog framing emphasizes theory of computation, signal processing, and mathematical reasoning about information. + +## Course Description + +### Ultimate Limits to Communication and Computation +- Objective: Explain the broad intellectual scope of the course. +- Exercise: List the main topic clusters that connect communication, computation, and entropy. +The course examines the ultimate limits to communication and computation with emphasis on the physical nature of information processing. The source description highlights information and computation, digital signals, codes and compression, noise, probability, error correction, reversible and irreversible operations, physics of computation, and quantum computation. Entropy is explicitly connected both to channel capacity and to the second law of thermodynamics. + +## Resource Types + +### Open Textbooks, Problem Sets, and Programming Work +- Objective: Identify the main kinds of learning resources supplied through the course. +- Exercise: Explain how these resource types support both conceptual study and practice. +The course home lists open textbooks, problem sets, problem set solutions, and programming assignments. A learner using Didactopus should treat these as complementary evidence sources rather than relying on one summary alone. 
diff --git a/examples/ocw-information-entropy/course/syllabus.md b/examples/ocw-information-entropy/course/syllabus.md new file mode 100644 index 0000000..9a9d0fd --- /dev/null +++ b/examples/ocw-information-entropy/course/syllabus.md @@ -0,0 +1,30 @@ +# MIT OCW 6.050J Information and Entropy: Syllabus + +Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/ +Attribution: adapted from the MIT OpenCourseWare syllabus page for 6.050J Information and Entropy. + +## Course Logistics + +### Prerequisites and Mathematical Background +- Objective: Explain the mathematical maturity expected by the course. +- Exercise: Decide whether a learner needs review in probability, linear algebra, or signals before beginning. +The syllabus expects a foundation comparable to MIT subjects in calculus and linear algebra, together with comfort in probability, signals, and basic programming. Didactopus should therefore surface prerequisite review when those foundations appear weak. + +### Assessment Structure +- Objective: Identify the role of problem sets, exams, and programming work in the course. +- Exercise: Build a study schedule that alternates reading, derivation, and worked exercises. +The syllabus emphasizes regular problem solving and quantitative reasoning. The course is not only a reading list: learners are expected to derive results, solve structured problems, and connect abstract arguments to implementation-oriented tasks. + +## Reading Base + +### Course Notes and Reference Texts +- Objective: Explain how the course notes and textbook references supply the core conceptual sequence. +- Exercise: Compare when to use course notes versus outside references for clarification. +MIT OCW links course notes and textbook-style resources through the syllabus and resource pages. The intended use is cumulative: earlier notes establish counting, probability, and entropy, while later materials expand into coding, noise, secrecy, thermodynamics, and computation. + +## Learning Norms + +### Independent Reasoning and Careful Comparison +- Objective: Explain why the course requires precise comparison of related but non-identical concepts. +- Exercise: Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic entropy. +The syllabus framing implies a style of work where analogy is useful but dangerous when used loosely. Learners must compare models carefully, state assumptions, and notice where similar mathematics does not imply identical interpretation. diff --git a/examples/ocw-information-entropy/course/unit-sequence.md b/examples/ocw-information-entropy/course/unit-sequence.md new file mode 100644 index 0000000..af8ddee --- /dev/null +++ b/examples/ocw-information-entropy/course/unit-sequence.md @@ -0,0 +1,72 @@ +# MIT OCW 6.050J Information and Entropy: Unit Sequence + +Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/ +Attribution: adapted from the MIT OpenCourseWare unit progression and resource organization for 6.050J Information and Entropy. + +## Foundations + +### Counting and Probability +- Objective: Explain how counting arguments, probability spaces, and random variables support later information-theory results. +- Exercise: Derive a simple counting argument for binary strings and compute an event probability. +Early units establish counting, combinatorics, and probability as the language used to reason about uncertainty, messages, and evidence. 
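+
+A minimal illustrative sketch, added for this Didactopus example and not drawn from the OCW pages, showing the counting-and-probability exercise above in runnable form (the choice of n = 3 is arbitrary):
+
+```python
+from math import log2
+
+n = 3                        # length of the binary strings
+num_strings = 2 ** n         # counting argument: 2^n distinct strings of length n
+p_single = 1 / num_strings   # uniform model: every string equally likely
+
+print(num_strings)           # 8
+print(p_single)              # 0.125
+print(log2(num_strings))     # 3.0 bits of uncertainty resolved by observing one string
+```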
+ +### Shannon Entropy +- Objective: Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources. +- Exercise: Compute the entropy of a Bernoulli source and interpret the result. +The course then introduces entropy as a quantitative measure of uncertainty for a source model and uses it to reason about representation cost and surprise. + +### Mutual Information +- Objective: Explain mutual information and relate it to dependence between signals or observations. +- Exercise: Compare independent variables with dependent variables using mutual-information reasoning. +These units ask the learner to understand how observation changes uncertainty and what it means for one variable to carry information about another. + +## Coding and Compression + +### Source Coding and Compression +- Objective: Explain lossless compression in terms of entropy, redundancy, and coding choices. +- Exercise: Describe when compression succeeds and when it fails on already-random data. +The course develops the idea that structured sources can often be described more efficiently, but only up to limits implied by entropy. + +### Huffman Coding +- Objective: Explain Huffman coding and justify why likely symbols receive shorter descriptions. +- Exercise: Build a Huffman code for a small source alphabet. +Learners use trees and expected length arguments to connect probability models to practical code design. + +## Communication Under Noise + +### Channel Capacity +- Objective: Explain channel capacity as a limit on reliable communication over a noisy channel. +- Exercise: State why reliable transmission above capacity is impossible in the long run. +The course treats capacity as a fundamental upper bound and frames noisy communication in terms of rates, inference, and uncertainty reduction. + +### Channel Coding +- Objective: Explain how channel coding adds redundancy to protect messages from noise. +- Exercise: Contrast uncoded transmission with coded transmission on a noisy channel. +These units emphasize that redundancy can be wasteful in compression but essential in communication under uncertainty. + +### Error Correcting Codes +- Objective: Explain how error-correcting codes detect or repair corrupted symbols. +- Exercise: Describe a simple parity-style code and its limits. +The learner must connect abstract limits to concrete coding mechanisms and understand both strengths and failure modes. + +## Broader Applications + +### Cryptography and Information Hiding +- Objective: Explain the relationship between secrecy, information leakage, and coded communication. +- Exercise: Compare a secure scheme with a weak one in terms of revealed information. +The course extends information-theoretic reasoning to adversarial settings where controlling what an observer can infer becomes central. + +### Thermodynamics and Entropy +- Objective: Explain how thermodynamic entropy relates to, and differs from, Shannon entropy. +- Exercise: Compare the two entropy notions and identify what is preserved across the analogy. +The course uses entropy as a bridge concept between communication theory and physics while insisting on careful interpretation. + +### Reversible Computation and Quantum Computation +- Objective: Explain why the physical implementation of computation matters for information processing limits. +- Exercise: Summarize how reversible computation changes the discussion of dissipation and information loss. 
+Later units connect information, entropy, and computation more directly by considering reversible logic, irreversibility, and quantum information themes. + +### Course Synthesis +- Objective: Synthesize the course by connecting entropy, coding, reliability, secrecy, and physical interpretation in one coherent narrative. +- Exercise: Produce a final study guide that links source coding, channel coding, secrecy, thermodynamic analogies, and computation. +The end of the course asks the learner to unify the mathematical and physical perspectives rather than treating the units as disconnected topics. diff --git a/examples/ocw-information-entropy/sources.yaml b/examples/ocw-information-entropy/sources.yaml index 8423411..adaa0e4 100644 --- a/examples/ocw-information-entropy/sources.yaml +++ b/examples/ocw-information-entropy/sources.yaml @@ -6,7 +6,20 @@ sources: creator: MIT OpenCourseWare license_id: CC BY-NC-SA 4.0 license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/ - retrieved_at: "2026-03-14" + retrieved_at: "2026-03-16" + adapted: true + attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0. + excluded_from_upstream_license: false + exclusion_notes: "" + + - source_id: mit-ocw-6-050j-syllabus + title: MIT OpenCourseWare 6.050J Information and Entropy syllabus + url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/ + publisher: Massachusetts Institute of Technology + creator: MIT OpenCourseWare + license_id: CC BY-NC-SA 4.0 + license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/ + retrieved_at: "2026-03-16" adapted: true attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0. excluded_from_upstream_license: false @@ -19,7 +32,7 @@ sources: creator: MIT OpenCourseWare license_id: CC BY-NC-SA 4.0 license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/ - retrieved_at: "2026-03-14" + retrieved_at: "2026-03-16" adapted: true attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0. excluded_from_upstream_license: false @@ -32,7 +45,7 @@ sources: creator: MIT OpenCourseWare license_id: CC BY-NC-SA 4.0 license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/ - retrieved_at: "2026-03-14" + retrieved_at: "2026-03-16" adapted: true attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0. excluded_from_upstream_license: false diff --git a/skills/ocw-information-entropy-agent/SKILL.md b/skills/ocw-information-entropy-agent/SKILL.md index 0c7e8de..fb15dcd 100644 --- a/skills/ocw-information-entropy-agent/SKILL.md +++ b/skills/ocw-information-entropy-agent/SKILL.md @@ -12,8 +12,9 @@ Use this skill when the task is about tutoring, evaluating, or planning study in 1. Read `references/generated-course-summary.md` for the pack structure and target concepts. 2. Read `references/generated-capability-summary.md` to understand what the demo AI learner already mastered. 3. Use `assets/generated/pack/` as the source of truth for concept ids, prerequisites, and mastery signals. -4. When giving guidance, preserve the pack ordering from fundamentals through coding and thermodynamics. -5. When uncertain, say which concept or prerequisite in the generated pack is underspecified. +4. 
Use `assets/generated/pack/source_corpus.json` to ground explanations in the ingested source material before relying on model prior knowledge. +5. When giving guidance, preserve the pack ordering from fundamentals through coding and thermodynamics. +6. When uncertain, say which concept or prerequisite in the generated pack is underspecified and which source fragment would need review. ## Outputs diff --git a/skills/ocw-information-entropy-agent/assets/generated/pack/concepts.yaml b/skills/ocw-information-entropy-agent/assets/generated/pack/concepts.yaml index e6a84ed..b033978 100644 --- a/skills/ocw-information-entropy-agent/assets/generated/pack/concepts.yaml +++ b/skills/ocw-information-entropy-agent/assets/generated/pack/concepts.yaml @@ -1,54 +1,203 @@ concepts: -- id: mit-ocw-6-050j-information-and-entropy - title: MIT OCW 6.050J Information and Entropy - description: 'Source: MIT OpenCourseWare 6.050J Information and Entropy, Spring - 2008. +- id: mit-ocw-6-050j-information-and-entropy-course-home + title: 'MIT OCW 6.050J Information and Entropy: Course Home' + description: 'Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/ - Attribution: adapted from the OCW course overview, unit sequence, and assigned - textbook references.' + Attribution: adapted from the MIT OpenCourseWare course home page for 6.050J Information + and Entropy.' prerequisites: [] mastery_signals: [] mastery_profile: {} -- id: information - title: Information - description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information - and Entropy'. +- id: information-and-entropy + title: Information and Entropy + description: '- Objective: Identify the course title, instructors, departments, + level, and major topical areas. + + - Exercise: Summarize the course in one paragraph for a prospective learner. + + MIT OpenCourseWare presents 6.050J Information and Entropy as a S' + prerequisites: + - mit-ocw-6-050j-information-and-entropy-course-home + mastery_signals: + - Identify the course title, instructors, departments, level, and major topical + areas. + mastery_profile: {} +- id: paul + title: Paul + description: Candidate concept extracted from lesson 'Information and Entropy'. prerequisites: [] - mastery_signals: [] + mastery_signals: + - Identify the course title, instructors, departments, level, and major topical + areas. + mastery_profile: {} +- id: penfield + title: Penfield + description: Candidate concept extracted from lesson 'Information and Entropy'. + prerequisites: [] + mastery_signals: + - Identify the course title, instructors, departments, level, and major topical + areas. + mastery_profile: {} +- id: seth + title: Seth + description: Candidate concept extracted from lesson 'Information and Entropy'. + prerequisites: [] + mastery_signals: + - Identify the course title, instructors, departments, level, and major topical + areas. + mastery_profile: {} +- id: lloyd + title: Lloyd + description: Candidate concept extracted from lesson 'Information and Entropy'. + prerequisites: [] + mastery_signals: + - Identify the course title, instructors, departments, level, and major topical + areas. + mastery_profile: {} +- id: electrical + title: Electrical + description: Candidate concept extracted from lesson 'Information and Entropy'. + prerequisites: [] + mastery_signals: + - Identify the course title, instructors, departments, level, and major topical + areas. 
+ mastery_profile: {} +- id: engineering + title: Engineering + description: Candidate concept extracted from lesson 'Information and Entropy'. + prerequisites: [] + mastery_signals: + - Identify the course title, instructors, departments, level, and major topical + areas. + mastery_profile: {} +- id: ultimate-limits-to-communication-and-computation + title: Ultimate Limits to Communication and Computation + description: '- Objective: Explain the broad intellectual scope of the course. + + - Exercise: List the main topic clusters that connect communication, computation, + and entropy. + + The course examines the ultimate limits to communication and computation with + em' + prerequisites: + - information-and-entropy + mastery_signals: + - Explain the broad intellectual scope of the course. mastery_profile: {} - id: entropy title: Entropy - description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information - and Entropy'. + description: Candidate concept extracted from lesson 'Ultimate Limits to Communication + and Computation'. prerequisites: [] + mastery_signals: + - Explain the broad intellectual scope of the course. + mastery_profile: {} +- id: open-textbooks-problem-sets-and-programming-work + title: Open Textbooks, Problem Sets, and Programming Work + description: '- Objective: Identify the main kinds of learning resources supplied + through the course. + + - Exercise: Explain how these resource types support both conceptual study and + practice. + + The course home lists open textbooks, problem sets, problem set' + prerequisites: + - ultimate-limits-to-communication-and-computation + mastery_signals: + - Identify the main kinds of learning resources supplied through the course. + mastery_profile: {} +- id: mit-ocw-6-050j-information-and-entropy-syllabus + title: 'MIT OCW 6.050J Information and Entropy: Syllabus' + description: 'Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/ + + Attribution: adapted from the MIT OpenCourseWare syllabus page for 6.050J Information + and Entropy.' + prerequisites: + - open-textbooks-problem-sets-and-programming-work mastery_signals: [] mastery_profile: {} -- id: source - title: Source - description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information - and Entropy'. - prerequisites: [] - mastery_signals: [] +- id: prerequisites-and-mathematical-background + title: Prerequisites and Mathematical Background + description: '- Objective: Explain the mathematical maturity expected by the course. + + - Exercise: Decide whether a learner needs review in probability, linear algebra, + or signals before beginning. + + The syllabus expects a foundation comparable to MIT subjec' + prerequisites: + - mit-ocw-6-050j-information-and-entropy-syllabus + mastery_signals: + - Explain the mathematical maturity expected by the course. mastery_profile: {} -- id: opencourseware - title: OpenCourseWare - description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information - and Entropy'. - prerequisites: [] - mastery_signals: [] +- id: assessment-structure + title: Assessment Structure + description: '- Objective: Identify the role of problem sets, exams, and programming + work in the course. + + - Exercise: Build a study schedule that alternates reading, derivation, and worked + exercises. 
+ + The syllabus emphasizes regular problem solving and qua' + prerequisites: + - prerequisites-and-mathematical-background + mastery_signals: + - Identify the role of problem sets, exams, and programming work in the course. mastery_profile: {} -- id: spring - title: Spring - description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information - and Entropy'. - prerequisites: [] - mastery_signals: [] +- id: course-notes-and-reference-texts + title: Course Notes and Reference Texts + description: '- Objective: Explain how the course notes and textbook references + supply the core conceptual sequence. + + - Exercise: Compare when to use course notes versus outside references for clarification. + + MIT OCW links course notes and textbook-style r' + prerequisites: + - assessment-structure + mastery_signals: + - Explain how the course notes and textbook references supply the core conceptual + sequence. mastery_profile: {} -- id: attribution - title: Attribution - description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information - and Entropy'. +- id: independent-reasoning-and-careful-comparison + title: Independent Reasoning and Careful Comparison + description: '- Objective: Explain why the course requires precise comparison of + related but non-identical concepts. + + - Exercise: Write a short note distinguishing Shannon entropy, channel capacity, + and thermodynamic entropy. + + The syllabus framing implies' + prerequisites: + - course-notes-and-reference-texts + mastery_signals: + - Explain why the course requires precise comparison of related but non-identical + concepts. + mastery_profile: {} +- id: shannon + title: Shannon + description: Candidate concept extracted from lesson 'Independent Reasoning and + Careful Comparison'. prerequisites: [] + mastery_signals: + - Explain why the course requires precise comparison of related but non-identical + concepts. + mastery_profile: {} +- id: learners + title: Learners + description: Candidate concept extracted from lesson 'Independent Reasoning and + Careful Comparison'. + prerequisites: [] + mastery_signals: + - Explain why the course requires precise comparison of related but non-identical + concepts. + mastery_profile: {} +- id: mit-ocw-6-050j-information-and-entropy-unit-sequence + title: 'MIT OCW 6.050J Information and Entropy: Unit Sequence' + description: 'Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/ + + Attribution: adapted from the MIT OpenCourseWare unit progression and resource + organization for 6.050J Information and Entropy.' + prerequisites: + - independent-reasoning-and-careful-comparison mastery_signals: [] mastery_profile: {} - id: counting-and-probability @@ -59,284 +208,132 @@ concepts: - Exercise: Derive a simple counting argument for binary strings and compute an event probability. - This lesson i' + Early units e' prerequisites: - - mit-ocw-6-050j-information-and-entropy - mastery_signals: [] - mastery_profile: {} -- id: counting - title: Counting - description: Candidate concept extracted from lesson 'Counting and Probability'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: probability - title: Probability - description: Candidate concept extracted from lesson 'Counting and Probability'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: objective - title: Objective - description: Candidate concept extracted from lesson 'Counting and Probability'. 
- prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: explain - title: Explain - description: Candidate concept extracted from lesson 'Counting and Probability'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: exercise - title: Exercise - description: Candidate concept extracted from lesson 'Counting and Probability'. - prerequisites: [] - mastery_signals: [] + - mit-ocw-6-050j-information-and-entropy-unit-sequence + mastery_signals: + - Explain how counting arguments, probability spaces, and random variables support + later information-theory results. mastery_profile: {} - id: derive title: Derive description: Candidate concept extracted from lesson 'Counting and Probability'. prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: this - title: This - description: Candidate concept extracted from lesson 'Counting and Probability'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: random - title: Random - description: Candidate concept extracted from lesson 'Counting and Probability'. - prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain how counting arguments, probability spaces, and random variables support + later information-theory results. mastery_profile: {} - id: shannon-entropy title: Shannon Entropy - description: '- Objective: Explain Shannon Entropy as a measure of uncertainty and + description: '- Objective: Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources. - Exercise: Compute the entropy of a Bernoulli source and interpret the result. - This lesson centers Shannon Entropy, Surprise' + The course then introduces entropy as a quant' prerequisites: - counting-and-probability - mastery_signals: [] - mastery_profile: {} -- id: shannon - title: Shannon - description: Candidate concept extracted from lesson 'Shannon Entropy'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: compute - title: Compute - description: Candidate concept extracted from lesson 'Shannon Entropy'. - prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain Shannon entropy as a measure of uncertainty and compare high-entropy and + low-entropy sources. mastery_profile: {} - id: bernoulli title: Bernoulli description: Candidate concept extracted from lesson 'Shannon Entropy'. prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain Shannon entropy as a measure of uncertainty and compare high-entropy and + low-entropy sources. mastery_profile: {} - id: mutual-information title: Mutual Information - description: '- Objective: Explain Mutual Information and relate it to dependence - between signals. + description: '- Objective: Explain mutual information and relate it to dependence + between signals or observations. - Exercise: Compare independent variables with dependent variables using mutual-information reasoning. - This lesson introduces Mutual Information, Dependenc' + These units ask the learner to under' prerequisites: - shannon-entropy - mastery_signals: [] + mastery_signals: + - Explain mutual information and relate it to dependence between signals or observations. mastery_profile: {} -- id: mutual - title: Mutual - description: Candidate concept extracted from lesson 'Mutual Information'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: compare - title: Compare - description: Candidate concept extracted from lesson 'Mutual Information'. 
- prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: dependence - title: Dependence - description: Candidate concept extracted from lesson 'Mutual Information'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: data-compression - title: Data Compression - description: '- Objective: Explain lossless compression in terms of entropy and - typical structure. +- id: source-coding-and-compression + title: Source Coding and Compression + description: '- Objective: Explain lossless compression in terms of entropy, redundancy, + and coding choices. - Exercise: Describe when compression succeeds and when it fails on already-random data. - This lesson covers Data Compression, Redundancy, and Efficient Rep' + The course develops the idea that structured sources can' prerequisites: - mutual-information - mastery_signals: [] - mastery_profile: {} -- id: data - title: Data - description: Candidate concept extracted from lesson 'Data Compression'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: compression - title: Compression - description: Candidate concept extracted from lesson 'Data Compression'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: describe - title: Describe - description: Candidate concept extracted from lesson 'Data Compression'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: redundancy - title: Redundancy - description: Candidate concept extracted from lesson 'Data Compression'. - prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain lossless compression in terms of entropy, redundancy, and coding choices. mastery_profile: {} - id: huffman-coding title: Huffman Coding - description: '- Objective: Explain Huffman Coding and justify why shorter codewords - should track more likely symbols. + description: '- Objective: Explain Huffman coding and justify why likely symbols + receive shorter descriptions. - Exercise: Build a Huffman code for a small source alphabet. - This lesson focuses on Huffman Coding, Prefix Codes, and Expected Length.' + Learners use trees and expected length arguments to connect probability models + to' prerequisites: - - data-compression - mastery_signals: [] - mastery_profile: {} -- id: huffman - title: Huffman - description: Candidate concept extracted from lesson 'Huffman Coding'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: coding - title: Coding - description: Candidate concept extracted from lesson 'Huffman Coding'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: build - title: Build - description: Candidate concept extracted from lesson 'Huffman Coding'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: prefix - title: Prefix - description: Candidate concept extracted from lesson 'Huffman Coding'. - prerequisites: [] - mastery_signals: [] + - source-coding-and-compression + mastery_signals: + - Explain Huffman coding and justify why likely symbols receive shorter descriptions. mastery_profile: {} - id: channel-capacity title: Channel Capacity - description: '- Objective: Explain Channel Capacity as a limit on reliable communication - over noisy channels. + description: '- Objective: Explain channel capacity as a limit on reliable communication + over a noisy channel. - Exercise: State why reliable transmission above capacity is impossible in the long run. 
- This lesson develops Channel Capacity, Reliable Commun' + The course treats capacity as a fundamental upper bou' prerequisites: - huffman-coding - mastery_signals: [] - mastery_profile: {} -- id: channel - title: Channel - description: Candidate concept extracted from lesson 'Channel Capacity'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: capacity - title: Capacity - description: Candidate concept extracted from lesson 'Channel Capacity'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: state - title: State - description: Candidate concept extracted from lesson 'Channel Capacity'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: reliable - title: Reliable - description: Candidate concept extracted from lesson 'Channel Capacity'. - prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain channel capacity as a limit on reliable communication over a noisy channel. mastery_profile: {} - id: channel-coding title: Channel Coding - description: '- Objective: Explain how Channel Coding adds structure that protects - messages against noise. + description: '- Objective: Explain how channel coding adds redundancy to protect + messages from noise. - Exercise: Contrast uncoded transmission with coded transmission on a noisy channel. - This lesson connects Channel Coding, Decoding, and Reliabilit' + These units emphasize that redundancy can be wasteful in compressi' prerequisites: - channel-capacity - mastery_signals: [] + mastery_signals: + - Explain how channel coding adds redundancy to protect messages from noise. mastery_profile: {} - id: contrast title: Contrast description: Candidate concept extracted from lesson 'Channel Coding'. prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: decoding - title: Decoding - description: Candidate concept extracted from lesson 'Channel Coding'. - prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain how channel coding adds redundancy to protect messages from noise. mastery_profile: {} - id: error-correcting-codes title: Error Correcting Codes - description: '- Objective: Explain how Error Correcting Codes detect or correct - symbol corruption. + description: '- Objective: Explain how error-correcting codes detect or repair corrupted + symbols. - Exercise: Describe a simple parity-style code and its limits. - This lesson covers Error Correcting Codes, Parity, and Syndrome-style reasoning. - The learne' + The learner must connect abstract limits to concrete coding mechanisms and understand + both s' prerequisites: - channel-coding - mastery_signals: [] - mastery_profile: {} -- id: error - title: Error - description: Candidate concept extracted from lesson 'Error Correcting Codes'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: correcting - title: Correcting - description: Candidate concept extracted from lesson 'Error Correcting Codes'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: codes - title: Codes - description: Candidate concept extracted from lesson 'Error Correcting Codes'. - prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain how error-correcting codes detect or repair corrupted symbols. mastery_profile: {} - id: cryptography-and-information-hiding title: Cryptography and Information Hiding @@ -345,24 +342,11 @@ concepts: - Exercise: Compare a secure scheme with a weak one in terms of revealed information. 
- This lesson combines Cryptography, Information Leakag' + The course extends information-theoretic reasoning to' prerequisites: - error-correcting-codes - mastery_signals: [] - mastery_profile: {} -- id: cryptography - title: Cryptography - description: Candidate concept extracted from lesson 'Cryptography and Information - Hiding'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: hiding - title: Hiding - description: Candidate concept extracted from lesson 'Cryptography and Information - Hiding'. - prerequisites: [] - mastery_signals: [] + mastery_signals: + - Explain the relationship between secrecy, information leakage, and coded communication. mastery_profile: {} - id: thermodynamics-and-entropy title: Thermodynamics and Entropy @@ -372,49 +356,37 @@ concepts: - Exercise: Compare the two entropy notions and identify what is preserved across the analogy. - This lesson connects Thermodynamics, Entropy, and P' + The course uses entropy as a bridge concept between' prerequisites: - cryptography-and-information-hiding - mastery_signals: [] + mastery_signals: + - Explain how thermodynamic entropy relates to, and differs from, Shannon entropy. mastery_profile: {} -- id: thermodynamics - title: Thermodynamics - description: Candidate concept extracted from lesson 'Thermodynamics and Entropy'. - prerequisites: [] - mastery_signals: [] +- id: reversible-computation-and-quantum-computation + title: Reversible Computation and Quantum Computation + description: '- Objective: Explain why the physical implementation of computation + matters for information processing limits. + + - Exercise: Summarize how reversible computation changes the discussion of dissipation + and information loss. + + Later units connect' + prerequisites: + - thermodynamics-and-entropy + mastery_signals: + - Explain why the physical implementation of computation matters for information + processing limits. mastery_profile: {} - id: course-synthesis title: Course Synthesis description: '- Objective: Synthesize the course by connecting entropy, coding, - reliability, and physical interpretation in one coherent narrative. + reliability, secrecy, and physical interpretation in one coherent narrative. - Exercise: Produce a final study guide that links source coding, channel coding, - secrecy, and thermodynam' + secrecy, thermo' prerequisites: - - thermodynamics-and-entropy - mastery_signals: [] - mastery_profile: {} -- id: course - title: Course - description: Candidate concept extracted from lesson 'Course Synthesis'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: synthesis - title: Synthesis - description: Candidate concept extracted from lesson 'Course Synthesis'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: synthesize - title: Synthesize - description: Candidate concept extracted from lesson 'Course Synthesis'. - prerequisites: [] - mastery_signals: [] - mastery_profile: {} -- id: produce - title: Produce - description: Candidate concept extracted from lesson 'Course Synthesis'. - prerequisites: [] - mastery_signals: [] + - reversible-computation-and-quantum-computation + mastery_signals: + - Synthesize the course by connecting entropy, coding, reliability, secrecy, and + physical interpretation in one coherent narrative. 
mastery_profile: {} diff --git a/skills/ocw-information-entropy-agent/assets/generated/pack/license_attribution.json b/skills/ocw-information-entropy-agent/assets/generated/pack/license_attribution.json index c2ffc0c..7b35fd3 100644 --- a/skills/ocw-information-entropy-agent/assets/generated/pack/license_attribution.json +++ b/skills/ocw-information-entropy-agent/assets/generated/pack/license_attribution.json @@ -2,9 +2,19 @@ "rights_note": "Derived from MIT OpenCourseWare 6.050J Information and Entropy (Spring 2008). Retain MIT OCW attribution and applicable Creative Commons terms before redistribution.", "sources": [ { - "source_path": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/6-050j-information-and-entropy.md", + "source_path": "examples/ocw-information-entropy/course/course-home.md", "source_type": "markdown", - "title": "6 050J Information And Entropy" + "title": "Course Home" + }, + { + "source_path": "examples/ocw-information-entropy/course/syllabus.md", + "source_type": "markdown", + "title": "Syllabus" + }, + { + "source_path": "examples/ocw-information-entropy/course/unit-sequence.md", + "source_type": "markdown", + "title": "Unit Sequence" } ] } \ No newline at end of file diff --git a/skills/ocw-information-entropy-agent/assets/generated/pack/pack.yaml b/skills/ocw-information-entropy-agent/assets/generated/pack/pack.yaml index fa010da..7317be6 100644 --- a/skills/ocw-information-entropy-agent/assets/generated/pack/pack.yaml +++ b/skills/ocw-information-entropy-agent/assets/generated/pack/pack.yaml @@ -12,3 +12,5 @@ dependencies: [] overrides: [] profile_templates: {} cross_pack_links: [] +supporting_artifacts: +- source_corpus.json diff --git a/skills/ocw-information-entropy-agent/assets/generated/pack/pack_compliance_manifest.json b/skills/ocw-information-entropy-agent/assets/generated/pack/pack_compliance_manifest.json index 6c71ee5..d64c7b1 100644 --- a/skills/ocw-information-entropy-agent/assets/generated/pack/pack_compliance_manifest.json +++ b/skills/ocw-information-entropy-agent/assets/generated/pack/pack_compliance_manifest.json @@ -3,6 +3,7 @@ "display_name": "MIT OCW Information and Entropy", "derived_from_sources": [ "mit-ocw-6-050j-course-home", + "mit-ocw-6-050j-syllabus", "mit-ocw-6-050j-unit-8-textbook", "mit-ocw-6-050j-unit-13-textbook" ], diff --git a/skills/ocw-information-entropy-agent/assets/generated/pack/review_report.md b/skills/ocw-information-entropy-agent/assets/generated/pack/review_report.md index fcc415c..989beb1 100644 --- a/skills/ocw-information-entropy-agent/assets/generated/pack/review_report.md +++ b/skills/ocw-information-entropy-agent/assets/generated/pack/review_report.md @@ -1,59 +1,5 @@ # Review Report -- Module 'Imported from MARKDOWN' has no explicit exercises; mastery signals may be weak. -- Concept 'MIT OCW 6.050J Information and Entropy' has no extracted mastery signals; review manually. -- Concept 'Information' has no extracted mastery signals; review manually. -- Concept 'Entropy' has no extracted mastery signals; review manually. -- Concept 'Source' has no extracted mastery signals; review manually. -- Concept 'OpenCourseWare' has no extracted mastery signals; review manually. -- Concept 'Spring' has no extracted mastery signals; review manually. -- Concept 'Attribution' has no extracted mastery signals; review manually. -- Concept 'Counting and Probability' has no extracted mastery signals; review manually. -- Concept 'Counting' has no extracted mastery signals; review manually. 
-- Concept 'Probability' has no extracted mastery signals; review manually. -- Concept 'Objective' has no extracted mastery signals; review manually. -- Concept 'Explain' has no extracted mastery signals; review manually. -- Concept 'Exercise' has no extracted mastery signals; review manually. -- Concept 'Derive' has no extracted mastery signals; review manually. -- Concept 'This' has no extracted mastery signals; review manually. -- Concept 'Random' has no extracted mastery signals; review manually. -- Concept 'Shannon Entropy' has no extracted mastery signals; review manually. -- Concept 'Shannon' has no extracted mastery signals; review manually. -- Concept 'Compute' has no extracted mastery signals; review manually. -- Concept 'Bernoulli' has no extracted mastery signals; review manually. -- Concept 'Mutual Information' has no extracted mastery signals; review manually. -- Concept 'Mutual' has no extracted mastery signals; review manually. -- Concept 'Compare' has no extracted mastery signals; review manually. -- Concept 'Dependence' has no extracted mastery signals; review manually. -- Concept 'Data Compression' has no extracted mastery signals; review manually. -- Concept 'Data' has no extracted mastery signals; review manually. -- Concept 'Compression' has no extracted mastery signals; review manually. -- Concept 'Describe' has no extracted mastery signals; review manually. -- Concept 'Redundancy' has no extracted mastery signals; review manually. -- Concept 'Huffman Coding' has no extracted mastery signals; review manually. -- Concept 'Huffman' has no extracted mastery signals; review manually. -- Concept 'Coding' has no extracted mastery signals; review manually. -- Concept 'Build' has no extracted mastery signals; review manually. -- Concept 'Prefix' has no extracted mastery signals; review manually. -- Concept 'Channel Capacity' has no extracted mastery signals; review manually. -- Concept 'Channel' has no extracted mastery signals; review manually. -- Concept 'Capacity' has no extracted mastery signals; review manually. -- Concept 'State' has no extracted mastery signals; review manually. -- Concept 'Reliable' has no extracted mastery signals; review manually. -- Concept 'Channel Coding' has no extracted mastery signals; review manually. -- Concept 'Contrast' has no extracted mastery signals; review manually. -- Concept 'Decoding' has no extracted mastery signals; review manually. -- Concept 'Error Correcting Codes' has no extracted mastery signals; review manually. -- Concept 'Error' has no extracted mastery signals; review manually. -- Concept 'Correcting' has no extracted mastery signals; review manually. -- Concept 'Codes' has no extracted mastery signals; review manually. -- Concept 'Cryptography and Information Hiding' has no extracted mastery signals; review manually. -- Concept 'Cryptography' has no extracted mastery signals; review manually. -- Concept 'Hiding' has no extracted mastery signals; review manually. -- Concept 'Thermodynamics and Entropy' has no extracted mastery signals; review manually. -- Concept 'Thermodynamics' has no extracted mastery signals; review manually. -- Concept 'Course Synthesis' has no extracted mastery signals; review manually. -- Concept 'Course' has no extracted mastery signals; review manually. -- Concept 'Synthesis' has no extracted mastery signals; review manually. -- Concept 'Synthesize' has no extracted mastery signals; review manually. -- Concept 'Produce' has no extracted mastery signals; review manually. 
\ No newline at end of file +- Concept 'MIT OCW 6.050J Information and Entropy: Course Home' has no extracted mastery signals; review manually. +- Concept 'MIT OCW 6.050J Information and Entropy: Syllabus' has no extracted mastery signals; review manually. +- Concept 'MIT OCW 6.050J Information and Entropy: Unit Sequence' has no extracted mastery signals; review manually. \ No newline at end of file diff --git a/skills/ocw-information-entropy-agent/assets/generated/pack/roadmap.yaml b/skills/ocw-information-entropy-agent/assets/generated/pack/roadmap.yaml index 862dccb..5c13c99 100644 --- a/skills/ocw-information-entropy-agent/assets/generated/pack/roadmap.yaml +++ b/skills/ocw-information-entropy-agent/assets/generated/pack/roadmap.yaml @@ -2,16 +2,50 @@ stages: - id: stage-1 title: Imported from MARKDOWN concepts: - - mit-ocw-6-050j-information-and-entropy + - mit-ocw-6-050j-information-and-entropy-course-home + - information-and-entropy + - ultimate-limits-to-communication-and-computation + - open-textbooks-problem-sets-and-programming-work + - mit-ocw-6-050j-information-and-entropy-syllabus + - prerequisites-and-mathematical-background + - assessment-structure + - course-notes-and-reference-texts + - independent-reasoning-and-careful-comparison + - mit-ocw-6-050j-information-and-entropy-unit-sequence - counting-and-probability - shannon-entropy - mutual-information - - data-compression + - source-coding-and-compression - huffman-coding - channel-capacity - channel-coding - error-correcting-codes - cryptography-and-information-hiding - thermodynamics-and-entropy + - reversible-computation-and-quantum-computation - course-synthesis - checkpoint: [] + checkpoint: + - Summarize the course in one paragraph for a prospective learner. + - List the main topic clusters that connect communication, computation, and entropy. + - Explain how these resource types support both conceptual study and practice. + - Decide whether a learner needs review in probability, linear algebra, or signals + before beginning. + - Build a study schedule that alternates reading, derivation, and worked exercises. + - Compare when to use course notes versus outside references for clarification. + - Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic + entropy. + - Derive a simple counting argument for binary strings and compute an event probability. + - Compute the entropy of a Bernoulli source and interpret the result. + - Compare independent variables with dependent variables using mutual-information + reasoning. + - Describe when compression succeeds and when it fails on already-random data. + - Build a Huffman code for a small source alphabet. + - State why reliable transmission above capacity is impossible in the long run. + - Contrast uncoded transmission with coded transmission on a noisy channel. + - Describe a simple parity-style code and its limits. + - Compare a secure scheme with a weak one in terms of revealed information. + - Compare the two entropy notions and identify what is preserved across the analogy. + - Summarize how reversible computation changes the discussion of dissipation and + information loss. + - Produce a final study guide that links source coding, channel coding, secrecy, + thermodynamic analogies, and computation. 
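The stage roadmap above sequences concept ids and checkpoint exercises, while the `source_corpus.json` emitted alongside it (the next file in this diff) carries the grounded text behind each lesson. The following is a minimal, hypothetical sketch of how downstream tutoring code might pull that grounded material for one lesson; the lesson title is an arbitrary example, and the field names (`fragments`, `lesson_title`, `kind`, `text`, `source_refs`) are taken from the generated schema shown below.

```python
# Hypothetical sketch: collect grounded fragments for one lesson from the generated corpus.
import json

corpus_path = "skills/ocw-information-entropy-agent/assets/generated/pack/source_corpus.json"
with open(corpus_path, encoding="utf-8") as fh:
    corpus = json.load(fh)

lesson = "Shannon Entropy"  # arbitrary example lesson title
for fragment in corpus["fragments"]:
    if fragment["lesson_title"] != lesson:
        continue
    # Each fragment records its kind (lesson_body / objective / exercise) and source files.
    print(fragment["kind"], "<-", ", ".join(fragment["source_refs"]))
    print(fragment["text"])
```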
diff --git a/skills/ocw-information-entropy-agent/assets/generated/pack/source_corpus.json b/skills/ocw-information-entropy-agent/assets/generated/pack/source_corpus.json new file mode 100644 index 0000000..beb558a --- /dev/null +++ b/skills/ocw-information-entropy-agent/assets/generated/pack/source_corpus.json @@ -0,0 +1,803 @@ +{ + "course_title": "MIT OCW Information and Entropy", + "rights_note": "Derived from MIT OpenCourseWare 6.050J Information and Entropy (Spring 2008). Retain MIT OCW attribution and applicable Creative Commons terms before redistribution.", + "sources": [ + { + "source_path": "examples/ocw-information-entropy/course/course-home.md", + "source_type": "markdown", + "title": "Course Home", + "metadata": {} + }, + { + "source_path": "examples/ocw-information-entropy/course/syllabus.md", + "source_type": "markdown", + "title": "Syllabus", + "metadata": {} + }, + { + "source_path": "examples/ocw-information-entropy/course/unit-sequence.md", + "source_type": "markdown", + "title": "Unit Sequence", + "metadata": {} + } + ], + "fragments": [ + { + "fragment_id": "imported-from-markdown::mit-ocw-6.050j-information-and-entropy:-course-home::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "MIT OCW 6.050J Information and Entropy: Course Home", + "text": "Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/\nAttribution: adapted from the MIT OpenCourseWare course home page for 6.050J Information and Entropy.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ], + "objectives": [], + "exercises": [], + "key_terms": [ + "Information", + "Entropy" + ] + }, + { + "fragment_id": "imported-from-markdown::information-and-entropy::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Information and Entropy", + "text": "- Objective: Identify the course title, instructors, departments, level, and major topical areas.\n- Exercise: Summarize the course in one paragraph for a prospective learner.\nMIT OpenCourseWare presents 6.050J Information and Entropy as a Spring 2008 undergraduate subject taught by Paul Penfield and Seth Lloyd in Electrical Engineering and Computer Science together with Mechanical Engineering. The catalog framing emphasizes theory of computation, signal processing, and mathematical reasoning about information.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ], + "objectives": [ + "Identify the course title, instructors, departments, level, and major topical areas." + ], + "exercises": [ + "Summarize the course in one paragraph for a prospective learner." 
+ ], + "key_terms": [ + "Information", + "Entropy", + "Paul", + "Penfield", + "Seth", + "Lloyd", + "Electrical", + "Engineering" + ] + }, + { + "fragment_id": "imported-from-markdown::information-and-entropy::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Information and Entropy", + "text": "Identify the course title, instructors, departments, level, and major topical areas.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ] + }, + { + "fragment_id": "imported-from-markdown::information-and-entropy::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Information and Entropy", + "text": "Summarize the course in one paragraph for a prospective learner.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ] + }, + { + "fragment_id": "imported-from-markdown::ultimate-limits-to-communication-and-computation::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Ultimate Limits to Communication and Computation", + "text": "- Objective: Explain the broad intellectual scope of the course.\n- Exercise: List the main topic clusters that connect communication, computation, and entropy.\nThe course examines the ultimate limits to communication and computation with emphasis on the physical nature of information processing. The source description highlights information and computation, digital signals, codes and compression, noise, probability, error correction, reversible and irreversible operations, physics of computation, and quantum computation. Entropy is explicitly connected both to channel capacity and to the second law of thermodynamics.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ], + "objectives": [ + "Explain the broad intellectual scope of the course." + ], + "exercises": [ + "List the main topic clusters that connect communication, computation, and entropy." + ], + "key_terms": [ + "Entropy" + ] + }, + { + "fragment_id": "imported-from-markdown::ultimate-limits-to-communication-and-computation::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Ultimate Limits to Communication and Computation", + "text": "Explain the broad intellectual scope of the course.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ] + }, + { + "fragment_id": "imported-from-markdown::ultimate-limits-to-communication-and-computation::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Ultimate Limits to Communication and Computation", + "text": "List the main topic clusters that connect communication, computation, and entropy.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ] + }, + { + "fragment_id": "imported-from-markdown::open-textbooks,-problem-sets,-and-programming-work::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Open Textbooks, Problem Sets, and Programming Work", + "text": "- Objective: Identify the main kinds of learning resources supplied through the course.\n- Exercise: Explain how these resource types support both conceptual study and practice.\nThe course home lists open textbooks, problem sets, problem set solutions, and programming assignments. 
A learner using Didactopus should treat these as complementary evidence sources rather than relying on one summary alone.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ], + "objectives": [ + "Identify the main kinds of learning resources supplied through the course." + ], + "exercises": [ + "Explain how these resource types support both conceptual study and practice." + ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::open-textbooks,-problem-sets,-and-programming-work::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Open Textbooks, Problem Sets, and Programming Work", + "text": "Identify the main kinds of learning resources supplied through the course.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ] + }, + { + "fragment_id": "imported-from-markdown::open-textbooks,-problem-sets,-and-programming-work::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Open Textbooks, Problem Sets, and Programming Work", + "text": "Explain how these resource types support both conceptual study and practice.", + "source_refs": [ + "examples/ocw-information-entropy/course/course-home.md" + ] + }, + { + "fragment_id": "imported-from-markdown::mit-ocw-6.050j-information-and-entropy:-syllabus::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "MIT OCW 6.050J Information and Entropy: Syllabus", + "text": "Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/\nAttribution: adapted from the MIT OpenCourseWare syllabus page for 6.050J Information and Entropy.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ], + "objectives": [], + "exercises": [], + "key_terms": [ + "Information", + "Entropy" + ] + }, + { + "fragment_id": "imported-from-markdown::prerequisites-and-mathematical-background::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Prerequisites and Mathematical Background", + "text": "- Objective: Explain the mathematical maturity expected by the course.\n- Exercise: Decide whether a learner needs review in probability, linear algebra, or signals before beginning.\nThe syllabus expects a foundation comparable to MIT subjects in calculus and linear algebra, together with comfort in probability, signals, and basic programming. Didactopus should therefore surface prerequisite review when those foundations appear weak.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ], + "objectives": [ + "Explain the mathematical maturity expected by the course." + ], + "exercises": [ + "Decide whether a learner needs review in probability, linear algebra, or signals before beginning." 
+ ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::prerequisites-and-mathematical-background::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Prerequisites and Mathematical Background", + "text": "Explain the mathematical maturity expected by the course.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::prerequisites-and-mathematical-background::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Prerequisites and Mathematical Background", + "text": "Decide whether a learner needs review in probability, linear algebra, or signals before beginning.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::assessment-structure::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Assessment Structure", + "text": "- Objective: Identify the role of problem sets, exams, and programming work in the course.\n- Exercise: Build a study schedule that alternates reading, derivation, and worked exercises.\nThe syllabus emphasizes regular problem solving and quantitative reasoning. The course is not only a reading list: learners are expected to derive results, solve structured problems, and connect abstract arguments to implementation-oriented tasks.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ], + "objectives": [ + "Identify the role of problem sets, exams, and programming work in the course." + ], + "exercises": [ + "Build a study schedule that alternates reading, derivation, and worked exercises." + ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::assessment-structure::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Assessment Structure", + "text": "Identify the role of problem sets, exams, and programming work in the course.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::assessment-structure::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Assessment Structure", + "text": "Build a study schedule that alternates reading, derivation, and worked exercises.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::course-notes-and-reference-texts::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Course Notes and Reference Texts", + "text": "- Objective: Explain how the course notes and textbook references supply the core conceptual sequence.\n- Exercise: Compare when to use course notes versus outside references for clarification.\nMIT OCW links course notes and textbook-style resources through the syllabus and resource pages. The intended use is cumulative: earlier notes establish counting, probability, and entropy, while later materials expand into coding, noise, secrecy, thermodynamics, and computation.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ], + "objectives": [ + "Explain how the course notes and textbook references supply the core conceptual sequence." + ], + "exercises": [ + "Compare when to use course notes versus outside references for clarification." 
+ ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::course-notes-and-reference-texts::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Course Notes and Reference Texts", + "text": "Explain how the course notes and textbook references supply the core conceptual sequence.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::course-notes-and-reference-texts::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Course Notes and Reference Texts", + "text": "Compare when to use course notes versus outside references for clarification.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::independent-reasoning-and-careful-comparison::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Independent Reasoning and Careful Comparison", + "text": "- Objective: Explain why the course requires precise comparison of related but non-identical concepts.\n- Exercise: Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic entropy.\nThe syllabus framing implies a style of work where analogy is useful but dangerous when used loosely. Learners must compare models carefully, state assumptions, and notice where similar mathematics does not imply identical interpretation.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ], + "objectives": [ + "Explain why the course requires precise comparison of related but non-identical concepts." + ], + "exercises": [ + "Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic entropy." 
+ ], + "key_terms": [ + "Shannon", + "Learners" + ] + }, + { + "fragment_id": "imported-from-markdown::independent-reasoning-and-careful-comparison::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Independent Reasoning and Careful Comparison", + "text": "Explain why the course requires precise comparison of related but non-identical concepts.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::independent-reasoning-and-careful-comparison::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Independent Reasoning and Careful Comparison", + "text": "Write a short note distinguishing Shannon entropy, channel capacity, and thermodynamic entropy.", + "source_refs": [ + "examples/ocw-information-entropy/course/syllabus.md" + ] + }, + { + "fragment_id": "imported-from-markdown::mit-ocw-6.050j-information-and-entropy:-unit-sequence::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "MIT OCW 6.050J Information and Entropy: Unit Sequence", + "text": "Source: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/\nAttribution: adapted from the MIT OpenCourseWare unit progression and resource organization for 6.050J Information and Entropy.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [], + "exercises": [], + "key_terms": [ + "Information", + "Entropy" + ] + }, + { + "fragment_id": "imported-from-markdown::counting-and-probability::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Counting and Probability", + "text": "- Objective: Explain how counting arguments, probability spaces, and random variables support later information-theory results.\n- Exercise: Derive a simple counting argument for binary strings and compute an event probability.\nEarly units establish counting, combinatorics, and probability as the language used to reason about uncertainty, messages, and evidence.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain how counting arguments, probability spaces, and random variables support later information-theory results." + ], + "exercises": [ + "Derive a simple counting argument for binary strings and compute an event probability." 
+ ], + "key_terms": [ + "Derive" + ] + }, + { + "fragment_id": "imported-from-markdown::counting-and-probability::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Counting and Probability", + "text": "Explain how counting arguments, probability spaces, and random variables support later information-theory results.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::counting-and-probability::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Counting and Probability", + "text": "Derive a simple counting argument for binary strings and compute an event probability.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::shannon-entropy::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Shannon Entropy", + "text": "- Objective: Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources.\n- Exercise: Compute the entropy of a Bernoulli source and interpret the result.\nThe course then introduces entropy as a quantitative measure of uncertainty for a source model and uses it to reason about representation cost and surprise.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources." + ], + "exercises": [ + "Compute the entropy of a Bernoulli source and interpret the result." + ], + "key_terms": [ + "Shannon", + "Bernoulli" + ] + }, + { + "fragment_id": "imported-from-markdown::shannon-entropy::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Shannon Entropy", + "text": "Explain Shannon entropy as a measure of uncertainty and compare high-entropy and low-entropy sources.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::shannon-entropy::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Shannon Entropy", + "text": "Compute the entropy of a Bernoulli source and interpret the result.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::mutual-information::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Mutual Information", + "text": "- Objective: Explain mutual information and relate it to dependence between signals or observations.\n- Exercise: Compare independent variables with dependent variables using mutual-information reasoning.\nThese units ask the learner to understand how observation changes uncertainty and what it means for one variable to carry information about another.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain mutual information and relate it to dependence between signals or observations." + ], + "exercises": [ + "Compare independent variables with dependent variables using mutual-information reasoning." 
+ ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::mutual-information::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Mutual Information", + "text": "Explain mutual information and relate it to dependence between signals or observations.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::mutual-information::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Mutual Information", + "text": "Compare independent variables with dependent variables using mutual-information reasoning.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::source-coding-and-compression::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Source Coding and Compression", + "text": "- Objective: Explain lossless compression in terms of entropy, redundancy, and coding choices.\n- Exercise: Describe when compression succeeds and when it fails on already-random data.\nThe course develops the idea that structured sources can often be described more efficiently, but only up to limits implied by entropy.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain lossless compression in terms of entropy, redundancy, and coding choices." + ], + "exercises": [ + "Describe when compression succeeds and when it fails on already-random data." + ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::source-coding-and-compression::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Source Coding and Compression", + "text": "Explain lossless compression in terms of entropy, redundancy, and coding choices.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::source-coding-and-compression::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Source Coding and Compression", + "text": "Describe when compression succeeds and when it fails on already-random data.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::huffman-coding::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Huffman Coding", + "text": "- Objective: Explain Huffman coding and justify why likely symbols receive shorter descriptions.\n- Exercise: Build a Huffman code for a small source alphabet.\nLearners use trees and expected length arguments to connect probability models to practical code design.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain Huffman coding and justify why likely symbols receive shorter descriptions." + ], + "exercises": [ + "Build a Huffman code for a small source alphabet." 
+ ], + "key_terms": [ + "Huffman", + "Learners" + ] + }, + { + "fragment_id": "imported-from-markdown::huffman-coding::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Huffman Coding", + "text": "Explain Huffman coding and justify why likely symbols receive shorter descriptions.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::huffman-coding::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Huffman Coding", + "text": "Build a Huffman code for a small source alphabet.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::channel-capacity::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Channel Capacity", + "text": "- Objective: Explain channel capacity as a limit on reliable communication over a noisy channel.\n- Exercise: State why reliable transmission above capacity is impossible in the long run.\nThe course treats capacity as a fundamental upper bound and frames noisy communication in terms of rates, inference, and uncertainty reduction.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain channel capacity as a limit on reliable communication over a noisy channel." + ], + "exercises": [ + "State why reliable transmission above capacity is impossible in the long run." + ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::channel-capacity::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Channel Capacity", + "text": "Explain channel capacity as a limit on reliable communication over a noisy channel.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::channel-capacity::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Channel Capacity", + "text": "State why reliable transmission above capacity is impossible in the long run.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::channel-coding::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Channel Coding", + "text": "- Objective: Explain how channel coding adds redundancy to protect messages from noise.\n- Exercise: Contrast uncoded transmission with coded transmission on a noisy channel.\nThese units emphasize that redundancy can be wasteful in compression but essential in communication under uncertainty.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain how channel coding adds redundancy to protect messages from noise." + ], + "exercises": [ + "Contrast uncoded transmission with coded transmission on a noisy channel." 
+ ], + "key_terms": [ + "Contrast" + ] + }, + { + "fragment_id": "imported-from-markdown::channel-coding::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Channel Coding", + "text": "Explain how channel coding adds redundancy to protect messages from noise.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::channel-coding::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Channel Coding", + "text": "Contrast uncoded transmission with coded transmission on a noisy channel.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::error-correcting-codes::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Error Correcting Codes", + "text": "- Objective: Explain how error-correcting codes detect or repair corrupted symbols.\n- Exercise: Describe a simple parity-style code and its limits.\nThe learner must connect abstract limits to concrete coding mechanisms and understand both strengths and failure modes.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain how error-correcting codes detect or repair corrupted symbols." + ], + "exercises": [ + "Describe a simple parity-style code and its limits." + ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::error-correcting-codes::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Error Correcting Codes", + "text": "Explain how error-correcting codes detect or repair corrupted symbols.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::error-correcting-codes::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Error Correcting Codes", + "text": "Describe a simple parity-style code and its limits.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::cryptography-and-information-hiding::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Cryptography and Information Hiding", + "text": "- Objective: Explain the relationship between secrecy, information leakage, and coded communication.\n- Exercise: Compare a secure scheme with a weak one in terms of revealed information.\nThe course extends information-theoretic reasoning to adversarial settings where controlling what an observer can infer becomes central.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain the relationship between secrecy, information leakage, and coded communication." + ], + "exercises": [ + "Compare a secure scheme with a weak one in terms of revealed information." 
+ ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::cryptography-and-information-hiding::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Cryptography and Information Hiding", + "text": "Explain the relationship between secrecy, information leakage, and coded communication.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::cryptography-and-information-hiding::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Cryptography and Information Hiding", + "text": "Compare a secure scheme with a weak one in terms of revealed information.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::thermodynamics-and-entropy::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Thermodynamics and Entropy", + "text": "- Objective: Explain how thermodynamic entropy relates to, and differs from, Shannon entropy.\n- Exercise: Compare the two entropy notions and identify what is preserved across the analogy.\nThe course uses entropy as a bridge concept between communication theory and physics while insisting on careful interpretation.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain how thermodynamic entropy relates to, and differs from, Shannon entropy." + ], + "exercises": [ + "Compare the two entropy notions and identify what is preserved across the analogy." + ], + "key_terms": [ + "Shannon" + ] + }, + { + "fragment_id": "imported-from-markdown::thermodynamics-and-entropy::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Thermodynamics and Entropy", + "text": "Explain how thermodynamic entropy relates to, and differs from, Shannon entropy.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::thermodynamics-and-entropy::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Thermodynamics and Entropy", + "text": "Compare the two entropy notions and identify what is preserved across the analogy.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::reversible-computation-and-quantum-computation::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Reversible Computation and Quantum Computation", + "text": "- Objective: Explain why the physical implementation of computation matters for information processing limits.\n- Exercise: Summarize how reversible computation changes the discussion of dissipation and information loss.\nLater units connect information, entropy, and computation more directly by considering reversible logic, irreversibility, and quantum information themes.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Explain why the physical implementation of computation matters for information processing limits." + ], + "exercises": [ + "Summarize how reversible computation changes the discussion of dissipation and information loss." 
+ ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::reversible-computation-and-quantum-computation::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Reversible Computation and Quantum Computation", + "text": "Explain why the physical implementation of computation matters for information processing limits.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::reversible-computation-and-quantum-computation::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Reversible Computation and Quantum Computation", + "text": "Summarize how reversible computation changes the discussion of dissipation and information loss.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::course-synthesis::body", + "kind": "lesson_body", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Course Synthesis", + "text": "- Objective: Synthesize the course by connecting entropy, coding, reliability, secrecy, and physical interpretation in one coherent narrative.\n- Exercise: Produce a final study guide that links source coding, channel coding, secrecy, thermodynamic analogies, and computation.\nThe end of the course asks the learner to unify the mathematical and physical perspectives rather than treating the units as disconnected topics.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ], + "objectives": [ + "Synthesize the course by connecting entropy, coding, reliability, secrecy, and physical interpretation in one coherent narrative." + ], + "exercises": [ + "Produce a final study guide that links source coding, channel coding, secrecy, thermodynamic analogies, and computation." + ], + "key_terms": [] + }, + { + "fragment_id": "imported-from-markdown::course-synthesis::objective-1", + "kind": "objective", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Course Synthesis", + "text": "Synthesize the course by connecting entropy, coding, reliability, secrecy, and physical interpretation in one coherent narrative.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + }, + { + "fragment_id": "imported-from-markdown::course-synthesis::exercise-1", + "kind": "exercise", + "module_title": "Imported from MARKDOWN", + "lesson_title": "Course Synthesis", + "text": "Produce a final study guide that links source coding, channel coding, secrecy, thermodynamic analogies, and computation.", + "source_refs": [ + "examples/ocw-information-entropy/course/unit-sequence.md" + ] + } + ] +} \ No newline at end of file diff --git a/skills/ocw-information-entropy-agent/assets/generated/pack/source_inventory.yaml b/skills/ocw-information-entropy-agent/assets/generated/pack/source_inventory.yaml index 8423411..adaa0e4 100644 --- a/skills/ocw-information-entropy-agent/assets/generated/pack/source_inventory.yaml +++ b/skills/ocw-information-entropy-agent/assets/generated/pack/source_inventory.yaml @@ -6,7 +6,20 @@ sources: creator: MIT OpenCourseWare license_id: CC BY-NC-SA 4.0 license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/ - retrieved_at: "2026-03-14" + retrieved_at: "2026-03-16" + adapted: true + attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0. 
+ excluded_from_upstream_license: false + exclusion_notes: "" + + - source_id: mit-ocw-6-050j-syllabus + title: MIT OpenCourseWare 6.050J Information and Entropy syllabus + url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/pages/syllabus/ + publisher: Massachusetts Institute of Technology + creator: MIT OpenCourseWare + license_id: CC BY-NC-SA 4.0 + license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/ + retrieved_at: "2026-03-16" adapted: true attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0. excluded_from_upstream_license: false @@ -19,7 +32,7 @@ sources: creator: MIT OpenCourseWare license_id: CC BY-NC-SA 4.0 license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/ - retrieved_at: "2026-03-14" + retrieved_at: "2026-03-16" adapted: true attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0. excluded_from_upstream_license: false @@ -32,7 +45,7 @@ sources: creator: MIT OpenCourseWare license_id: CC BY-NC-SA 4.0 license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/ - retrieved_at: "2026-03-14" + retrieved_at: "2026-03-16" adapted: true attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0. excluded_from_upstream_license: false diff --git a/skills/ocw-information-entropy-agent/assets/generated/run/artifact_manifest.json b/skills/ocw-information-entropy-agent/assets/generated/run/artifact_manifest.json index 07fe9c9..dda9aab 100644 --- a/skills/ocw-information-entropy-agent/assets/generated/run/artifact_manifest.json +++ b/skills/ocw-information-entropy-agent/assets/generated/run/artifact_manifest.json @@ -3,9 +3,54 @@ "domain": "MIT OCW Information and Entropy", "artifacts": [ { - "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy", + "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home", "artifact_type": "symbolic", - "artifact_name": "mit-ocw-6-050j-information-and-entropy.md" + "artifact_name": "mit-ocw-6-050j-information-and-entropy-course-home.md" + }, + { + "concept": "mit-ocw-information-and-entropy::information-and-entropy", + "artifact_type": "symbolic", + "artifact_name": "information-and-entropy.md" + }, + { + "concept": "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation", + "artifact_type": "symbolic", + "artifact_name": "ultimate-limits-to-communication-and-computation.md" + }, + { + "concept": "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work", + "artifact_type": "symbolic", + "artifact_name": "open-textbooks-problem-sets-and-programming-work.md" + }, + { + "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus", + "artifact_type": "symbolic", + "artifact_name": "mit-ocw-6-050j-information-and-entropy-syllabus.md" + }, + { + "concept": "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background", + "artifact_type": "symbolic", + "artifact_name": "prerequisites-and-mathematical-background.md" + }, + { + "concept": "mit-ocw-information-and-entropy::assessment-structure", + "artifact_type": "symbolic", + "artifact_name": "assessment-structure.md" + }, + { + "concept": "mit-ocw-information-and-entropy::course-notes-and-reference-texts", + "artifact_type": "symbolic", + "artifact_name": 
"course-notes-and-reference-texts.md" + }, + { + "concept": "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison", + "artifact_type": "symbolic", + "artifact_name": "independent-reasoning-and-careful-comparison.md" + }, + { + "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence", + "artifact_type": "symbolic", + "artifact_name": "mit-ocw-6-050j-information-and-entropy-unit-sequence.md" }, { "concept": "mit-ocw-information-and-entropy::counting-and-probability", @@ -23,9 +68,9 @@ "artifact_name": "mutual-information.md" }, { - "concept": "mit-ocw-information-and-entropy::data-compression", + "concept": "mit-ocw-information-and-entropy::source-coding-and-compression", "artifact_type": "symbolic", - "artifact_name": "data-compression.md" + "artifact_name": "source-coding-and-compression.md" }, { "concept": "mit-ocw-information-and-entropy::huffman-coding", diff --git a/skills/ocw-information-entropy-agent/assets/generated/run/capability_profile.json b/skills/ocw-information-entropy-agent/assets/generated/run/capability_profile.json index 1bf670d..461588e 100644 --- a/skills/ocw-information-entropy-agent/assets/generated/run/capability_profile.json +++ b/skills/ocw-information-entropy-agent/assets/generated/run/capability_profile.json @@ -3,24 +3,42 @@ "display_name": "OCW Information Entropy Agent", "domain": "MIT OCW Information and Entropy", "mastered_concepts": [ + "mit-ocw-information-and-entropy::assessment-structure", "mit-ocw-information-and-entropy::channel-capacity", "mit-ocw-information-and-entropy::channel-coding", "mit-ocw-information-and-entropy::counting-and-probability", + "mit-ocw-information-and-entropy::course-notes-and-reference-texts", "mit-ocw-information-and-entropy::cryptography-and-information-hiding", - "mit-ocw-information-and-entropy::data-compression", "mit-ocw-information-and-entropy::error-correcting-codes", "mit-ocw-information-and-entropy::huffman-coding", - "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy", + "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison", + "mit-ocw-information-and-entropy::information-and-entropy", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence", "mit-ocw-information-and-entropy::mutual-information", + "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work", + "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background", "mit-ocw-information-and-entropy::shannon-entropy", - "mit-ocw-information-and-entropy::thermodynamics-and-entropy" + "mit-ocw-information-and-entropy::source-coding-and-compression", + "mit-ocw-information-and-entropy::thermodynamics-and-entropy", + "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation" ], "weak_dimensions_by_concept": { - "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy": [], + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home": [], + "mit-ocw-information-and-entropy::information-and-entropy": [], + "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation": [], + "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work": [], + 
"mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus": [], + "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background": [], + "mit-ocw-information-and-entropy::assessment-structure": [], + "mit-ocw-information-and-entropy::course-notes-and-reference-texts": [], + "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison": [], + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence": [], "mit-ocw-information-and-entropy::counting-and-probability": [], "mit-ocw-information-and-entropy::shannon-entropy": [], "mit-ocw-information-and-entropy::mutual-information": [], - "mit-ocw-information-and-entropy::data-compression": [], + "mit-ocw-information-and-entropy::source-coding-and-compression": [], "mit-ocw-information-and-entropy::huffman-coding": [], "mit-ocw-information-and-entropy::channel-capacity": [], "mit-ocw-information-and-entropy::channel-coding": [], @@ -29,7 +47,52 @@ "mit-ocw-information-and-entropy::thermodynamics-and-entropy": [] }, "evaluator_summary_by_concept": { - "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy": { + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::information-and-entropy": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::assessment-structure": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::course-notes-and-reference-texts": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison": { + "correctness": 0.8400000000000001, + "explanation": 0.85, + "critique": 0.7999999999999999 + }, + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence": { "correctness": 0.8400000000000001, "explanation": 0.85, "critique": 0.7999999999999999 @@ -49,7 +112,7 @@ "explanation": 0.85, "critique": 0.7999999999999999 }, - "mit-ocw-information-and-entropy::data-compression": { + "mit-ocw-information-and-entropy::source-coding-and-compression": { "correctness": 0.8400000000000001, "explanation": 0.85, "critique": 0.7999999999999999 @@ -87,9 +150,54 @@ }, "artifacts": [ { - "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy", + "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home", "artifact_type": "symbolic", - "artifact_name": 
"mit-ocw-6-050j-information-and-entropy.md" + "artifact_name": "mit-ocw-6-050j-information-and-entropy-course-home.md" + }, + { + "concept": "mit-ocw-information-and-entropy::information-and-entropy", + "artifact_type": "symbolic", + "artifact_name": "information-and-entropy.md" + }, + { + "concept": "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation", + "artifact_type": "symbolic", + "artifact_name": "ultimate-limits-to-communication-and-computation.md" + }, + { + "concept": "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work", + "artifact_type": "symbolic", + "artifact_name": "open-textbooks-problem-sets-and-programming-work.md" + }, + { + "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus", + "artifact_type": "symbolic", + "artifact_name": "mit-ocw-6-050j-information-and-entropy-syllabus.md" + }, + { + "concept": "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background", + "artifact_type": "symbolic", + "artifact_name": "prerequisites-and-mathematical-background.md" + }, + { + "concept": "mit-ocw-information-and-entropy::assessment-structure", + "artifact_type": "symbolic", + "artifact_name": "assessment-structure.md" + }, + { + "concept": "mit-ocw-information-and-entropy::course-notes-and-reference-texts", + "artifact_type": "symbolic", + "artifact_name": "course-notes-and-reference-texts.md" + }, + { + "concept": "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison", + "artifact_type": "symbolic", + "artifact_name": "independent-reasoning-and-careful-comparison.md" + }, + { + "concept": "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence", + "artifact_type": "symbolic", + "artifact_name": "mit-ocw-6-050j-information-and-entropy-unit-sequence.md" }, { "concept": "mit-ocw-information-and-entropy::counting-and-probability", @@ -107,9 +215,9 @@ "artifact_name": "mutual-information.md" }, { - "concept": "mit-ocw-information-and-entropy::data-compression", + "concept": "mit-ocw-information-and-entropy::source-coding-and-compression", "artifact_type": "symbolic", - "artifact_name": "data-compression.md" + "artifact_name": "source-coding-and-compression.md" }, { "concept": "mit-ocw-information-and-entropy::huffman-coding", diff --git a/skills/ocw-information-entropy-agent/assets/generated/run/capability_report.md b/skills/ocw-information-entropy-agent/assets/generated/run/capability_report.md index 9c65e6a..ee56f97 100644 --- a/skills/ocw-information-entropy-agent/assets/generated/run/capability_report.md +++ b/skills/ocw-information-entropy-agent/assets/generated/run/capability_report.md @@ -4,19 +4,34 @@ - Domain: `MIT OCW Information and Entropy` ## Mastered Concepts +- mit-ocw-information-and-entropy::assessment-structure - mit-ocw-information-and-entropy::channel-capacity - mit-ocw-information-and-entropy::channel-coding - mit-ocw-information-and-entropy::counting-and-probability +- mit-ocw-information-and-entropy::course-notes-and-reference-texts - mit-ocw-information-and-entropy::cryptography-and-information-hiding -- mit-ocw-information-and-entropy::data-compression - mit-ocw-information-and-entropy::error-correcting-codes - mit-ocw-information-and-entropy::huffman-coding -- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy +- mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison +- mit-ocw-information-and-entropy::information-and-entropy +- 
mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home +- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus +- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence - mit-ocw-information-and-entropy::mutual-information +- mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work +- mit-ocw-information-and-entropy::prerequisites-and-mathematical-background - mit-ocw-information-and-entropy::shannon-entropy +- mit-ocw-information-and-entropy::source-coding-and-compression - mit-ocw-information-and-entropy::thermodynamics-and-entropy +- mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation ## Concept Summaries +### mit-ocw-information-and-entropy::assessment-structure +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + ### mit-ocw-information-and-entropy::channel-capacity - correctness: 0.84 - critique: 0.80 @@ -35,13 +50,13 @@ - explanation: 0.85 - weak dimensions: none -### mit-ocw-information-and-entropy::cryptography-and-information-hiding +### mit-ocw-information-and-entropy::course-notes-and-reference-texts - correctness: 0.84 - critique: 0.80 - explanation: 0.85 - weak dimensions: none -### mit-ocw-information-and-entropy::data-compression +### mit-ocw-information-and-entropy::cryptography-and-information-hiding - correctness: 0.84 - critique: 0.80 - explanation: 0.85 @@ -59,7 +74,31 @@ - explanation: 0.85 - weak dimensions: none -### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy +### mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::information-and-entropy +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence - correctness: 0.84 - critique: 0.80 - explanation: 0.85 @@ -71,24 +110,57 @@ - explanation: 0.85 - weak dimensions: none +### mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::prerequisites-and-mathematical-background +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + ### mit-ocw-information-and-entropy::shannon-entropy - correctness: 0.84 - critique: 0.80 - explanation: 0.85 - weak dimensions: none +### mit-ocw-information-and-entropy::source-coding-and-compression +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + ### mit-ocw-information-and-entropy::thermodynamics-and-entropy - correctness: 0.84 - critique: 0.80 - explanation: 0.85 - weak dimensions: none +### mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + ## Artifacts -- mit-ocw-6-050j-information-and-entropy.md (symbolic) for 
mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy +- mit-ocw-6-050j-information-and-entropy-course-home.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home +- information-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::information-and-entropy +- ultimate-limits-to-communication-and-computation.md (symbolic) for mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation +- open-textbooks-problem-sets-and-programming-work.md (symbolic) for mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work +- mit-ocw-6-050j-information-and-entropy-syllabus.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus +- prerequisites-and-mathematical-background.md (symbolic) for mit-ocw-information-and-entropy::prerequisites-and-mathematical-background +- assessment-structure.md (symbolic) for mit-ocw-information-and-entropy::assessment-structure +- course-notes-and-reference-texts.md (symbolic) for mit-ocw-information-and-entropy::course-notes-and-reference-texts +- independent-reasoning-and-careful-comparison.md (symbolic) for mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison +- mit-ocw-6-050j-information-and-entropy-unit-sequence.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence - counting-and-probability.md (symbolic) for mit-ocw-information-and-entropy::counting-and-probability - shannon-entropy.md (symbolic) for mit-ocw-information-and-entropy::shannon-entropy - mutual-information.md (symbolic) for mit-ocw-information-and-entropy::mutual-information -- data-compression.md (symbolic) for mit-ocw-information-and-entropy::data-compression +- source-coding-and-compression.md (symbolic) for mit-ocw-information-and-entropy::source-coding-and-compression - huffman-coding.md (symbolic) for mit-ocw-information-and-entropy::huffman-coding - channel-capacity.md (symbolic) for mit-ocw-information-and-entropy::channel-capacity - channel-coding.md (symbolic) for mit-ocw-information-and-entropy::channel-coding diff --git a/skills/ocw-information-entropy-agent/assets/generated/run/run_summary.json b/skills/ocw-information-entropy-agent/assets/generated/run/run_summary.json index 0e67d87..a5d9609 100644 --- a/skills/ocw-information-entropy-agent/assets/generated/run/run_summary.json +++ b/skills/ocw-information-entropy-agent/assets/generated/run/run_summary.json @@ -1,75 +1,32 @@ { - "course_source": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/6-050j-information-and-entropy.md", - "pack_dir": "/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy", - "skill_dir": "/home/netuser/dev/Didactopustry1/skills/ocw-information-entropy-agent", - "source_inventory": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/sources.yaml", + "course_source": "examples/ocw-information-entropy/course", + "source_document_count": 3, + "pack_dir": "domain-packs/mit-ocw-information-entropy", + "skill_dir": "skills/ocw-information-entropy-agent", + "source_inventory": "examples/ocw-information-entropy/sources.yaml", "review_flags": [ - "Module 'Imported from MARKDOWN' has no explicit exercises; mastery signals may be weak.", - "Concept 'MIT OCW 6.050J Information and Entropy' has no extracted mastery signals; review manually.", - "Concept 'Information' has no extracted mastery signals; review manually.", - "Concept 'Entropy' has 
no extracted mastery signals; review manually.", - "Concept 'Source' has no extracted mastery signals; review manually.", - "Concept 'OpenCourseWare' has no extracted mastery signals; review manually.", - "Concept 'Spring' has no extracted mastery signals; review manually.", - "Concept 'Attribution' has no extracted mastery signals; review manually.", - "Concept 'Counting and Probability' has no extracted mastery signals; review manually.", - "Concept 'Counting' has no extracted mastery signals; review manually.", - "Concept 'Probability' has no extracted mastery signals; review manually.", - "Concept 'Objective' has no extracted mastery signals; review manually.", - "Concept 'Explain' has no extracted mastery signals; review manually.", - "Concept 'Exercise' has no extracted mastery signals; review manually.", - "Concept 'Derive' has no extracted mastery signals; review manually.", - "Concept 'This' has no extracted mastery signals; review manually.", - "Concept 'Random' has no extracted mastery signals; review manually.", - "Concept 'Shannon Entropy' has no extracted mastery signals; review manually.", - "Concept 'Shannon' has no extracted mastery signals; review manually.", - "Concept 'Compute' has no extracted mastery signals; review manually.", - "Concept 'Bernoulli' has no extracted mastery signals; review manually.", - "Concept 'Mutual Information' has no extracted mastery signals; review manually.", - "Concept 'Mutual' has no extracted mastery signals; review manually.", - "Concept 'Compare' has no extracted mastery signals; review manually.", - "Concept 'Dependence' has no extracted mastery signals; review manually.", - "Concept 'Data Compression' has no extracted mastery signals; review manually.", - "Concept 'Data' has no extracted mastery signals; review manually.", - "Concept 'Compression' has no extracted mastery signals; review manually.", - "Concept 'Describe' has no extracted mastery signals; review manually.", - "Concept 'Redundancy' has no extracted mastery signals; review manually.", - "Concept 'Huffman Coding' has no extracted mastery signals; review manually.", - "Concept 'Huffman' has no extracted mastery signals; review manually.", - "Concept 'Coding' has no extracted mastery signals; review manually.", - "Concept 'Build' has no extracted mastery signals; review manually.", - "Concept 'Prefix' has no extracted mastery signals; review manually.", - "Concept 'Channel Capacity' has no extracted mastery signals; review manually.", - "Concept 'Channel' has no extracted mastery signals; review manually.", - "Concept 'Capacity' has no extracted mastery signals; review manually.", - "Concept 'State' has no extracted mastery signals; review manually.", - "Concept 'Reliable' has no extracted mastery signals; review manually.", - "Concept 'Channel Coding' has no extracted mastery signals; review manually.", - "Concept 'Contrast' has no extracted mastery signals; review manually.", - "Concept 'Decoding' has no extracted mastery signals; review manually.", - "Concept 'Error Correcting Codes' has no extracted mastery signals; review manually.", - "Concept 'Error' has no extracted mastery signals; review manually.", - "Concept 'Correcting' has no extracted mastery signals; review manually.", - "Concept 'Codes' has no extracted mastery signals; review manually.", - "Concept 'Cryptography and Information Hiding' has no extracted mastery signals; review manually.", - "Concept 'Cryptography' has no extracted mastery signals; review manually.", - "Concept 'Hiding' has no extracted 
mastery signals; review manually.", - "Concept 'Thermodynamics and Entropy' has no extracted mastery signals; review manually.", - "Concept 'Thermodynamics' has no extracted mastery signals; review manually.", - "Concept 'Course Synthesis' has no extracted mastery signals; review manually.", - "Concept 'Course' has no extracted mastery signals; review manually.", - "Concept 'Synthesis' has no extracted mastery signals; review manually.", - "Concept 'Synthesize' has no extracted mastery signals; review manually.", - "Concept 'Produce' has no extracted mastery signals; review manually." + "Concept 'MIT OCW 6.050J Information and Entropy: Course Home' has no extracted mastery signals; review manually.", + "Concept 'MIT OCW 6.050J Information and Entropy: Syllabus' has no extracted mastery signals; review manually.", + "Concept 'MIT OCW 6.050J Information and Entropy: Unit Sequence' has no extracted mastery signals; review manually." ], - "concept_count": 56, + "concept_count": 34, + "source_fragment_count": 60, "target_concept": "mit-ocw-information-and-entropy::thermodynamics-and-entropy", "curriculum_path": [ - "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home", + "mit-ocw-information-and-entropy::information-and-entropy", + "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation", + "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus", + "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background", + "mit-ocw-information-and-entropy::assessment-structure", + "mit-ocw-information-and-entropy::course-notes-and-reference-texts", + "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence", "mit-ocw-information-and-entropy::counting-and-probability", "mit-ocw-information-and-entropy::shannon-entropy", "mit-ocw-information-and-entropy::mutual-information", - "mit-ocw-information-and-entropy::data-compression", + "mit-ocw-information-and-entropy::source-coding-and-compression", "mit-ocw-information-and-entropy::huffman-coding", "mit-ocw-information-and-entropy::channel-capacity", "mit-ocw-information-and-entropy::channel-coding", @@ -78,25 +35,35 @@ "mit-ocw-information-and-entropy::thermodynamics-and-entropy" ], "mastered_concepts": [ + "mit-ocw-information-and-entropy::assessment-structure", "mit-ocw-information-and-entropy::channel-capacity", "mit-ocw-information-and-entropy::channel-coding", "mit-ocw-information-and-entropy::counting-and-probability", + "mit-ocw-information-and-entropy::course-notes-and-reference-texts", "mit-ocw-information-and-entropy::cryptography-and-information-hiding", - "mit-ocw-information-and-entropy::data-compression", "mit-ocw-information-and-entropy::error-correcting-codes", "mit-ocw-information-and-entropy::huffman-coding", - "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy", + "mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison", + "mit-ocw-information-and-entropy::information-and-entropy", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home", + "mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus", + 
"mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence", "mit-ocw-information-and-entropy::mutual-information", + "mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work", + "mit-ocw-information-and-entropy::prerequisites-and-mathematical-background", "mit-ocw-information-and-entropy::shannon-entropy", - "mit-ocw-information-and-entropy::thermodynamics-and-entropy" + "mit-ocw-information-and-entropy::source-coding-and-compression", + "mit-ocw-information-and-entropy::thermodynamics-and-entropy", + "mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation" ], - "artifact_count": 11, - "compliance_manifest": "/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy/pack_compliance_manifest.json", + "artifact_count": 20, + "compliance_manifest": "domain-packs/mit-ocw-information-entropy/pack_compliance_manifest.json", "compliance": { "pack_id": "mit-ocw-information-and-entropy", "display_name": "MIT OCW Information and Entropy", "derived_from_sources": [ "mit-ocw-6-050j-course-home", + "mit-ocw-6-050j-syllabus", "mit-ocw-6-050j-unit-8-textbook", "mit-ocw-6-050j-unit-13-textbook" ], diff --git a/skills/ocw-information-entropy-agent/references/generated-capability-summary.md b/skills/ocw-information-entropy-agent/references/generated-capability-summary.md index 9c65e6a..ee56f97 100644 --- a/skills/ocw-information-entropy-agent/references/generated-capability-summary.md +++ b/skills/ocw-information-entropy-agent/references/generated-capability-summary.md @@ -4,19 +4,34 @@ - Domain: `MIT OCW Information and Entropy` ## Mastered Concepts +- mit-ocw-information-and-entropy::assessment-structure - mit-ocw-information-and-entropy::channel-capacity - mit-ocw-information-and-entropy::channel-coding - mit-ocw-information-and-entropy::counting-and-probability +- mit-ocw-information-and-entropy::course-notes-and-reference-texts - mit-ocw-information-and-entropy::cryptography-and-information-hiding -- mit-ocw-information-and-entropy::data-compression - mit-ocw-information-and-entropy::error-correcting-codes - mit-ocw-information-and-entropy::huffman-coding -- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy +- mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison +- mit-ocw-information-and-entropy::information-and-entropy +- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home +- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus +- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence - mit-ocw-information-and-entropy::mutual-information +- mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work +- mit-ocw-information-and-entropy::prerequisites-and-mathematical-background - mit-ocw-information-and-entropy::shannon-entropy +- mit-ocw-information-and-entropy::source-coding-and-compression - mit-ocw-information-and-entropy::thermodynamics-and-entropy +- mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation ## Concept Summaries +### mit-ocw-information-and-entropy::assessment-structure +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + ### mit-ocw-information-and-entropy::channel-capacity - correctness: 0.84 - critique: 0.80 @@ -35,13 +50,13 @@ - explanation: 0.85 - weak dimensions: none -### mit-ocw-information-and-entropy::cryptography-and-information-hiding +### 
mit-ocw-information-and-entropy::course-notes-and-reference-texts - correctness: 0.84 - critique: 0.80 - explanation: 0.85 - weak dimensions: none -### mit-ocw-information-and-entropy::data-compression +### mit-ocw-information-and-entropy::cryptography-and-information-hiding - correctness: 0.84 - critique: 0.80 - explanation: 0.85 @@ -59,7 +74,31 @@ - explanation: 0.85 - weak dimensions: none -### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy +### mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::information-and-entropy +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence - correctness: 0.84 - critique: 0.80 - explanation: 0.85 @@ -71,24 +110,57 @@ - explanation: 0.85 - weak dimensions: none +### mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + +### mit-ocw-information-and-entropy::prerequisites-and-mathematical-background +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + ### mit-ocw-information-and-entropy::shannon-entropy - correctness: 0.84 - critique: 0.80 - explanation: 0.85 - weak dimensions: none +### mit-ocw-information-and-entropy::source-coding-and-compression +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + ### mit-ocw-information-and-entropy::thermodynamics-and-entropy - correctness: 0.84 - critique: 0.80 - explanation: 0.85 - weak dimensions: none +### mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation +- correctness: 0.84 +- critique: 0.80 +- explanation: 0.85 +- weak dimensions: none + ## Artifacts -- mit-ocw-6-050j-information-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy +- mit-ocw-6-050j-information-and-entropy-course-home.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home +- information-and-entropy.md (symbolic) for mit-ocw-information-and-entropy::information-and-entropy +- ultimate-limits-to-communication-and-computation.md (symbolic) for mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation +- open-textbooks-problem-sets-and-programming-work.md (symbolic) for mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work +- mit-ocw-6-050j-information-and-entropy-syllabus.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus +- prerequisites-and-mathematical-background.md (symbolic) for mit-ocw-information-and-entropy::prerequisites-and-mathematical-background +- assessment-structure.md (symbolic) for mit-ocw-information-and-entropy::assessment-structure +- course-notes-and-reference-texts.md (symbolic) for mit-ocw-information-and-entropy::course-notes-and-reference-texts +- 
independent-reasoning-and-careful-comparison.md (symbolic) for mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison +- mit-ocw-6-050j-information-and-entropy-unit-sequence.md (symbolic) for mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence - counting-and-probability.md (symbolic) for mit-ocw-information-and-entropy::counting-and-probability - shannon-entropy.md (symbolic) for mit-ocw-information-and-entropy::shannon-entropy - mutual-information.md (symbolic) for mit-ocw-information-and-entropy::mutual-information -- data-compression.md (symbolic) for mit-ocw-information-and-entropy::data-compression +- source-coding-and-compression.md (symbolic) for mit-ocw-information-and-entropy::source-coding-and-compression - huffman-coding.md (symbolic) for mit-ocw-information-and-entropy::huffman-coding - channel-capacity.md (symbolic) for mit-ocw-information-and-entropy::channel-capacity - channel-coding.md (symbolic) for mit-ocw-information-and-entropy::channel-coding diff --git a/skills/ocw-information-entropy-agent/references/generated-course-summary.md b/skills/ocw-information-entropy-agent/references/generated-course-summary.md index 44b8fd8..1b7dbf2 100644 --- a/skills/ocw-information-entropy-agent/references/generated-course-summary.md +++ b/skills/ocw-information-entropy-agent/references/generated-course-summary.md @@ -1,14 +1,23 @@ # Generated Course Summary -- Pack dir: `/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy` -- Run dir: `/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy-run` +- Pack dir: `domain-packs/mit-ocw-information-entropy` +- Run dir: `examples/ocw-information-entropy-run` ## Curriculum Path Used By The Demo Learner -- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy +- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home +- mit-ocw-information-and-entropy::information-and-entropy +- mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation +- mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work +- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus +- mit-ocw-information-and-entropy::prerequisites-and-mathematical-background +- mit-ocw-information-and-entropy::assessment-structure +- mit-ocw-information-and-entropy::course-notes-and-reference-texts +- mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison +- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence - mit-ocw-information-and-entropy::counting-and-probability - mit-ocw-information-and-entropy::shannon-entropy - mit-ocw-information-and-entropy::mutual-information -- mit-ocw-information-and-entropy::data-compression +- mit-ocw-information-and-entropy::source-coding-and-compression - mit-ocw-information-and-entropy::huffman-coding - mit-ocw-information-and-entropy::channel-capacity - mit-ocw-information-and-entropy::channel-coding @@ -17,14 +26,23 @@ - mit-ocw-information-and-entropy::thermodynamics-and-entropy ## Mastered Concepts +- mit-ocw-information-and-entropy::assessment-structure - mit-ocw-information-and-entropy::channel-capacity - mit-ocw-information-and-entropy::channel-coding - mit-ocw-information-and-entropy::counting-and-probability +- mit-ocw-information-and-entropy::course-notes-and-reference-texts - mit-ocw-information-and-entropy::cryptography-and-information-hiding -- 
mit-ocw-information-and-entropy::data-compression - mit-ocw-information-and-entropy::error-correcting-codes - mit-ocw-information-and-entropy::huffman-coding -- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy +- mit-ocw-information-and-entropy::independent-reasoning-and-careful-comparison +- mit-ocw-information-and-entropy::information-and-entropy +- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-course-home +- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-syllabus +- mit-ocw-information-and-entropy::mit-ocw-6-050j-information-and-entropy-unit-sequence - mit-ocw-information-and-entropy::mutual-information +- mit-ocw-information-and-entropy::open-textbooks-problem-sets-and-programming-work +- mit-ocw-information-and-entropy::prerequisites-and-mathematical-background - mit-ocw-information-and-entropy::shannon-entropy -- mit-ocw-information-and-entropy::thermodynamics-and-entropy \ No newline at end of file +- mit-ocw-information-and-entropy::source-coding-and-compression +- mit-ocw-information-and-entropy::thermodynamics-and-entropy +- mit-ocw-information-and-entropy::ultimate-limits-to-communication-and-computation \ No newline at end of file diff --git a/src/didactopus/config.py b/src/didactopus/config.py index e103d3b..379e3b2 100644 --- a/src/didactopus/config.py +++ b/src/didactopus/config.py @@ -45,10 +45,38 @@ class PlatformConfig(BaseModel): return self.dimension_thresholds +class LocalProviderConfig(BaseModel): + backend: str = "stub" + model_name: str = "local-demo" + + +class RoleMeshProviderConfig(BaseModel): + base_url: str = os.getenv("DIDACTOPUS_ROLEMESH_BASE_URL", "http://127.0.0.1:8000") + api_key: str = os.getenv("DIDACTOPUS_ROLEMESH_API_KEY", "") + default_model: str = "planner" + role_to_model: dict[str, str] = Field( + default_factory=lambda: { + "mentor": "planner", + "learner": "writer", + "practice": "writer", + "project_advisor": "planner", + "evaluator": "reviewer", + } + ) + timeout_seconds: float = 30.0 + + +class ModelProviderConfig(BaseModel): + provider: str = "stub" + local: LocalProviderConfig = Field(default_factory=LocalProviderConfig) + rolemesh: RoleMeshProviderConfig = Field(default_factory=RoleMeshProviderConfig) + + class AppConfig(BaseModel): review: ReviewConfig = Field(default_factory=ReviewConfig) bridge: BridgeConfig = Field(default_factory=BridgeConfig) platform: PlatformConfig = Field(default_factory=PlatformConfig) + model_provider: ModelProviderConfig = Field(default_factory=ModelProviderConfig) def load_settings() -> Settings: @@ -64,4 +92,6 @@ def _with_platform_defaults(data: dict[str, Any]) -> dict[str, Any]: normalized = dict(data) if "platform" not in normalized: normalized["platform"] = {} + if "model_provider" not in normalized: + normalized["model_provider"] = {} return normalized diff --git a/src/didactopus/document_adapters.py b/src/didactopus/document_adapters.py index a759207..13c17c9 100644 --- a/src/didactopus/document_adapters.py +++ b/src/didactopus/document_adapters.py @@ -126,6 +126,19 @@ def detect_adapter(path: str | Path) -> str: return "text" +def is_supported_document(path: str | Path) -> bool: + p = Path(path) + return p.is_file() and detect_adapter(p) in {"markdown", "text", "html", "pdf", "docx", "pptx"} + + +def adapt_documents(path: str | Path) -> list[NormalizedDocument]: + p = Path(path) + if p.is_dir(): + docs = [adapt_document(child) for child in sorted(p.rglob("*")) if is_supported_document(child)] + return docs + return 
[adapt_document(p)] + + def adapt_document(path: str | Path) -> NormalizedDocument: adapter = detect_adapter(path) if adapter == "markdown": diff --git a/src/didactopus/mentor.py b/src/didactopus/mentor.py index 9ded6fd..11ed494 100644 --- a/src/didactopus/mentor.py +++ b/src/didactopus/mentor.py @@ -1,4 +1,5 @@ from .model_provider import ModelProvider +from .role_prompts import mentor_system_prompt def generate_socratic_prompt(provider: ModelProvider, concept: str, weak_dimensions: list[str] | None = None) -> str: @@ -6,5 +7,7 @@ def generate_socratic_prompt(provider: ModelProvider, concept: str, weak_dimensi if weak_dimensions: weak_text = f" Focus especially on weak dimensions: {', '.join(weak_dimensions)}." return provider.generate( - f"You are a Socratic mentor. Ask one probing question about '{concept}'.{weak_text}" + f"Ask one probing question about '{concept}'.{weak_text}", + role="mentor", + system_prompt=mentor_system_prompt(), ).text diff --git a/src/didactopus/model_provider.py b/src/didactopus/model_provider.py index 037a7ce..7c6e292 100644 --- a/src/didactopus/model_provider.py +++ b/src/didactopus/model_provider.py @@ -1,4 +1,10 @@ +from __future__ import annotations + from dataclasses import dataclass +import json +from typing import Callable +from urllib import request + from .config import ModelProviderConfig @@ -13,11 +19,92 @@ class ModelProvider: def __init__(self, config: ModelProviderConfig) -> None: self.config = config - def generate(self, prompt: str) -> ModelResponse: + def pending_notice(self, role: str | None, model_name: str | None = None) -> str: + label = role or "assistant" + notices = { + "mentor": "Didactopus is reviewing the next learning step before answering.", + "learner": "Didactopus is drafting the learner-side reflection now.", + "practice": "Didactopus is designing a practice task for you now.", + "project_advisor": "Didactopus is sketching a project direction now.", + "evaluator": "Didactopus is evaluating the work before replying.", + } + notice = notices.get(label, "Didactopus is preparing the next response.") + if model_name: + return f"{notice} Model: {model_name}." 
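+            # Fall back to the generic role notice when no model name is supplied.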
+ return notice + + def generate( + self, + prompt: str, + role: str | None = None, + system_prompt: str | None = None, + temperature: float | None = None, + max_tokens: int | None = None, + status_callback: Callable[[str], None] | None = None, + ) -> ModelResponse: + provider_name = self.config.provider.lower() + if provider_name == "rolemesh": + return self._generate_rolemesh(prompt, role, system_prompt, temperature, max_tokens, status_callback) + return self._generate_stub(prompt, role) + + def _generate_stub(self, prompt: str, role: str | None) -> ModelResponse: local = self.config.local preview = prompt.strip().replace("\n", " ")[:120] + role_text = f"[{role}] " if role else "" return ModelResponse( - text=f"[stubbed-response] {preview}", + text=f"[stubbed-response] {role_text}{preview}", provider=local.backend, model_name=local.model_name, ) + + def _generate_rolemesh( + self, + prompt: str, + role: str | None, + system_prompt: str | None, + temperature: float | None, + max_tokens: int | None, + status_callback: Callable[[str], None] | None, + ) -> ModelResponse: + rolemesh = self.config.rolemesh + model_name = rolemesh.role_to_model.get(role or "", rolemesh.default_model) + if status_callback is not None: + status_callback(self.pending_notice(role, model_name)) + messages = [] + if system_prompt: + messages.append({"role": "system", "content": system_prompt}) + messages.append({"role": "user", "content": prompt}) + payload = { + "model": model_name, + "messages": messages, + } + if temperature is not None: + payload["temperature"] = temperature + if max_tokens is not None: + payload["max_tokens"] = max_tokens + body = self._rolemesh_chat_completion(payload) + choices = body.get("choices", []) + if not choices: + raise RuntimeError("RoleMesh returned no choices.") + message = choices[0].get("message", {}) + text = message.get("content", "") + if not isinstance(text, str): + text = str(text) + return ModelResponse(text=text, provider="rolemesh", model_name=model_name) + + def _rolemesh_chat_completion(self, payload: dict) -> dict: + rolemesh = self.config.rolemesh + url = rolemesh.base_url.rstrip("/") + "/v1/chat/completions" + headers = { + "Content-Type": "application/json", + } + if rolemesh.api_key: + headers["X-Api-Key"] = rolemesh.api_key + req = request.Request( + url, + data=json.dumps(payload).encode("utf-8"), + headers=headers, + method="POST", + ) + with request.urlopen(req, timeout=rolemesh.timeout_seconds) as response: + return json.loads(response.read().decode("utf-8")) diff --git a/src/didactopus/ocw_information_entropy_demo.py b/src/didactopus/ocw_information_entropy_demo.py index a2b18d7..e99605b 100644 --- a/src/didactopus/ocw_information_entropy_demo.py +++ b/src/didactopus/ocw_information_entropy_demo.py @@ -6,7 +6,7 @@ from pathlib import Path from .agentic_loop import AgenticStudentState, integrate_attempt from .artifact_registry import validate_pack from .course_ingestion_compliance import build_pack_compliance_manifest, load_sources, write_manifest -from .document_adapters import adapt_document +from .document_adapters import adapt_documents from .evaluator_pipeline import LearnerAttempt from .graph_builder import build_concept_graph from .mastery_ledger import ( @@ -15,7 +15,7 @@ from .mastery_ledger import ( export_capability_profile_json, export_capability_report_markdown, ) -from .pack_emitter import build_draft_pack, write_draft_pack +from .pack_emitter import build_draft_pack, write_draft_pack, write_source_corpus from .rule_policy import RuleContext, 
build_default_rules, run_rules from .topic_ingest import build_topic_bundle, document_to_course, extract_concept_candidates, merge_courses_into_topic_course @@ -38,8 +38,9 @@ Use this skill when the task is about tutoring, evaluating, or planning study in 1. Read `references/generated-course-summary.md` for the pack structure and target concepts. 2. Read `references/generated-capability-summary.md` to understand what the demo AI learner already mastered. 3. Use `assets/generated/pack/` as the source of truth for concept ids, prerequisites, and mastery signals. -4. When giving guidance, preserve the pack ordering from fundamentals through coding and thermodynamics. -5. When uncertain, say which concept or prerequisite in the generated pack is underspecified. +4. Use `assets/generated/pack/source_corpus.json` to ground explanations in the ingested source material before relying on model prior knowledge. +5. When giving guidance, preserve the pack ordering from fundamentals through coding and thermodynamics. +6. When uncertain, say which concept or prerequisite in the generated pack is underspecified and which source fragment would need review. ## Outputs @@ -122,6 +123,15 @@ def _write_skill_bundle( (run_asset_dir / source.name).write_text(source.read_text(encoding="utf-8"), encoding="utf-8") +def _select_target_concept(pack_name: str, concepts: list, preferred_id: str = "thermodynamics-and-entropy") -> str: + ids = [concept.id for concept in concepts] + if preferred_id in ids: + return f"{pack_name}::{preferred_id}" + if not ids: + raise ValueError("No concept candidates available to select as target.") + return f"{pack_name}::{ids[-1]}" + + def run_ocw_information_entropy_demo( course_source: str | Path, source_inventory: str | Path, @@ -135,9 +145,11 @@ def run_ocw_information_entropy_demo( run_dir = Path(run_dir) skill_dir = Path(skill_dir) - doc = adapt_document(course_source) - course = document_to_course(doc, "MIT OCW Information and Entropy") - merged = merge_courses_into_topic_course(build_topic_bundle(course.title, [course])) + docs = adapt_documents(course_source) + if not docs: + raise ValueError(f"No supported source documents found under {course_source}") + courses = [document_to_course(doc, "MIT OCW Information and Entropy") for doc in docs] + merged = merge_courses_into_topic_course(build_topic_bundle("MIT OCW Information and Entropy", courses)) merged.rights_note = DEFAULT_RIGHTS_NOTE concepts = extract_concept_candidates(merged) @@ -153,6 +165,7 @@ def run_ocw_information_entropy_demo( conflicts=[], ) write_draft_pack(draft, pack_dir) + write_source_corpus(merged, pack_dir) if source_inventory.exists(): inventory = load_sources(source_inventory) compliance_manifest = build_pack_compliance_manifest(draft.pack["name"], draft.pack["display_name"], inventory) @@ -170,7 +183,7 @@ def run_ocw_information_entropy_demo( "project_execution": 0.75, "critique": 0.7, }) - target_key = f"{draft.pack['name']}::thermodynamics-and-entropy" + target_key = _select_target_concept(draft.pack["name"], ctx.concepts) concept_path = graph.curriculum_path_to_target(set(), target_key) state = AgenticStudentState( @@ -189,11 +202,13 @@ def run_ocw_information_entropy_demo( summary = { "course_source": str(course_source), + "source_document_count": len(docs), "pack_dir": str(pack_dir), "skill_dir": str(skill_dir), "source_inventory": str(source_inventory), "review_flags": list(ctx.review_flags), "concept_count": len(ctx.concepts), + "source_fragment_count": len(json.loads((pack_dir / 
"source_corpus.json").read_text(encoding="utf-8")).get("fragments", [])), "target_concept": target_key, "curriculum_path": concept_path, "mastered_concepts": sorted(state.mastered_concepts), @@ -216,7 +231,7 @@ def main() -> None: parser = argparse.ArgumentParser(description="Generate a domain pack and skill bundle from MIT OCW Information and Entropy.") parser.add_argument( "--course-source", - default=str(root / "examples" / "ocw-information-entropy" / "6-050j-information-and-entropy.md"), + default=str(root / "examples" / "ocw-information-entropy" / "course"), ) parser.add_argument( "--source-inventory", diff --git a/src/didactopus/ocw_rolemesh_transcript_demo.py b/src/didactopus/ocw_rolemesh_transcript_demo.py new file mode 100644 index 0000000..0b104f8 --- /dev/null +++ b/src/didactopus/ocw_rolemesh_transcript_demo.py @@ -0,0 +1,451 @@ +from __future__ import annotations + +import json +from pathlib import Path +import sys + +from .config import load_config +from .model_provider import ModelProvider +from .ocw_skill_agent_demo import load_ocw_skill_context +from .role_prompts import evaluator_system_prompt, learner_system_prompt, mentor_system_prompt, practice_system_prompt + + +def _format_turn(role: str, speaker: str, content: str) -> dict[str, str]: + return {"role": role, "speaker": speaker, "content": content.strip()} + + +def _normalize_completion(text: str) -> str: + stripped = text.strip() + if not stripped: + return stripped + if _looks_truncated(stripped): + return stripped.rstrip(" ,;:-") + "." + return stripped + + +def _looks_truncated(text: str) -> bool: + def _ends_with_truncated_marker(line: str) -> bool: + lowered_line = line.lower() + return any(lowered_line.endswith(marker) for marker in truncated_markers) + + stripped = text.strip() + if not stripped: + return True + if stripped.endswith(("...", "…")): + return True + truncated_markers = ( + "for example,", + "for instance,", + "such as", + "this means", + "therefore", + "however", + "furthermore", + "in particular", + "suppose we have", + "which means", + "so the", + "is h =", + "compare the entropy of one roll with the", + "with a crossover", + ) + lowered = stripped.lower() + if _ends_with_truncated_marker(lowered): + return True + if stripped[-1] not in ".!?)]}\"'": + return True + lines = [line.strip() for line in stripped.splitlines() if line.strip()] + if len(lines) >= 2: + for idx in range(len(lines) - 1): + current = lines[idx] + nxt = lines[idx + 1] + if nxt[:1].isdigit(): + if _ends_with_truncated_marker(current) or current[-1] not in (".", "!", "?", ":", ")"): + return True + tail = stripped.rsplit(None, 1)[-1].lower() + return tail in { + "a", + "an", + "and", + "as", + "because", + "but", + "for", + "if", + "in", + "of", + "or", + "so", + "the", + "to", + "with", + } + + +def _is_topical(text: str, required_terms: list[str], forbidden_terms: list[str] | None = None) -> bool: + lowered = text.lower() + if forbidden_terms and any(term in lowered for term in forbidden_terms): + return False + return any(term in lowered for term in required_terms) + + +def _generate_checked( + provider: ModelProvider, + prompt: str, + role: str, + system_prompt: str, + required_terms: list[str], + forbidden_terms: list[str] | None = None, + temperature: float = 0.2, + max_tokens: int = 180, + retries: int = 2, + status_callback=None, +) -> str: + attempt_prompt = prompt + for _ in range(retries + 1): + text = provider.generate( + attempt_prompt, + role=role, + system_prompt=system_prompt, + temperature=temperature, + 
max_tokens=max_tokens, + status_callback=status_callback, + ).text + if _is_topical(text, required_terms, forbidden_terms): + completed = text.strip() + continuation_budget = max(64, max_tokens // 2) + for _continuation_idx in range(3): + if not _looks_truncated(completed): + break + continuation = provider.generate( + "Continue the previous response without restarting it. Finish the thought cleanly and end with a complete sentence.\n\n" + f"Current draft:\n{completed}", + role=role, + system_prompt=system_prompt, + temperature=min(temperature, 0.2), + max_tokens=continuation_budget, + status_callback=status_callback, + ).text.strip() + if not continuation or continuation == completed: + break + completed = f"{completed.rstrip()} {continuation.lstrip()}" + if not _looks_truncated(continuation): + break + return _normalize_completion(completed) + attempt_prompt = ( + prompt + + " Stay strictly within information theory, entropy, probability, coding, or thermodynamics. " + + "Do not switch to politics, programming style, or unrelated topics." + ) + return _normalize_completion(text) + + +def _concept_title_map(context) -> dict[str, str]: + return {concept.get("id", ""): concept.get("title", concept.get("id", "")) for concept in context.concepts} + + +def _path_titles(context, limit: int | None = None) -> list[str]: + title_map = _concept_title_map(context) + titles: list[str] = [] + for concept_key in context.run_summary.get("curriculum_path", []): + concept_id = concept_key.split("::", 1)[-1] + titles.append(title_map.get(concept_id, concept_id.replace("-", " ").title())) + return titles[:limit] if limit is not None else titles + + +def _healthy_rolemesh_models(provider: ModelProvider) -> set[str]: + config = provider.config + if config.provider.lower() != "rolemesh": + return set() + models = set(config.rolemesh.role_to_model.values()) | {config.rolemesh.default_model} + healthy: set[str] = set() + for model in models: + try: + provider._rolemesh_chat_completion( # type: ignore[attr-defined] + { + "model": model, + "messages": [{"role": "user", "content": "Reply with ok."}], + "max_tokens": 16, + "temperature": 0.0, + } + ) + healthy.add(model) + except Exception: + continue + return healthy + + +def _apply_rolemesh_fallbacks(provider: ModelProvider) -> dict[str, str]: + config = provider.config + if config.provider.lower() != "rolemesh": + return {} + healthy = _healthy_rolemesh_models(provider) + if not healthy: + raise RuntimeError("No healthy RoleMesh models available for transcript generation.") + fallback_model = config.rolemesh.default_model if config.rolemesh.default_model in healthy else sorted(healthy)[0] + adjusted = {} + for role, model in list(config.rolemesh.role_to_model.items()): + if model not in healthy: + config.rolemesh.role_to_model[role] = fallback_model + adjusted[role] = fallback_model + return adjusted + + +def build_ocw_rolemesh_transcript(config_path: str | Path, skill_dir: str | Path) -> dict: + config = load_config(config_path) + provider = ModelProvider(config.model_provider) + context = load_ocw_skill_context(skill_dir) + role_fallbacks = _apply_rolemesh_fallbacks(provider) + status_updates: list[str] = [] + + def emit_status(message: str) -> None: + status_updates.append(message) + print(message, file=sys.stderr, flush=True) + + goal = ( + "I want to use the MIT OCW Information and Entropy course to understand how Shannon entropy, " + "channel capacity, and thermodynamic entropy relate without skipping the reasoning." 
+ ) + path_titles = _path_titles(context) + turns: list[dict[str, str]] = [] + turns.append(_format_turn("user", "Learner Goal", goal)) + + mentor_open = _generate_checked( + provider, + "The learner wants to approach the Information and Entropy course carefully. " + "Ask a short opening question that checks what they already understand and points them to the first concept.", + role="mentor", + system_prompt=mentor_system_prompt(), + required_terms=["information", "entropy", "probability", "counting"], + forbidden_terms=["president", "executive branch", "code"], + temperature=0.2, + max_tokens=140, + status_callback=emit_status, + ) + turns.append(_format_turn("assistant", "Didactopus Mentor", mentor_open)) + + learner_reflection = _generate_checked( + provider, + "Respond as the learner. Mention partial understanding of randomness and probability, but uncertainty about " + "how that becomes entropy and communication limits in information theory.", + role="learner", + system_prompt=learner_system_prompt(), + required_terms=["probability", "entropy", "information", "uncertainty"], + forbidden_terms=["president", "executive branch", "code"], + temperature=0.5, + max_tokens=140, + status_callback=emit_status, + ) + turns.append(_format_turn("assistant", "AI Learner", learner_reflection)) + + mentor_guidance = _generate_checked( + provider, + "Given the learner reflection, explain the first two concepts to study from the generated path and why. " + f"Path reference: {path_titles[:4]}", + role="mentor", + system_prompt=mentor_system_prompt(), + required_terms=["counting", "probability", "entropy", "information"], + forbidden_terms=["president", "executive branch", "code"], + temperature=0.2, + max_tokens=280, + status_callback=emit_status, + ) + turns.append(_format_turn("assistant", "Didactopus Mentor", mentor_guidance)) + + practice_task = _generate_checked( + provider, + "Generate one short practice task that forces the learner to connect counting/probability with Shannon entropy, " + "without giving away the full answer.", + role="practice", + system_prompt=practice_system_prompt(), + required_terms=["entropy", "probability", "die", "uncertainty", "shannon"], + forbidden_terms=["president", "executive branch", "code"], + temperature=0.4, + max_tokens=220, + status_callback=emit_status, + ) + turns.append(_format_turn("assistant", "Didactopus Practice Designer", practice_task)) + + learner_attempt = _generate_checked( + provider, + f"Respond as the learner attempting this task in information theory: {practice_task} " + "Give a concise answer in your own words, with one uncertainty still present.", + role="learner", + system_prompt=learner_system_prompt(), + required_terms=["entropy", "probability", "uncertainty", "die", "message"], + forbidden_terms=["president", "executive branch", "code"], + temperature=0.5, + max_tokens=280, + status_callback=emit_status, + ) + turns.append(_format_turn("assistant", "AI Learner", learner_attempt)) + + evaluator_feedback = _generate_checked( + provider, + "Evaluate this learner attempt for correctness, explanation quality, and limitations. 
" + f"Task: {practice_task}\nAttempt: {learner_attempt}", + role="evaluator", + system_prompt=evaluator_system_prompt(), + required_terms=["correctness", "entropy", "probability", "explanation", "limitation"], + forbidden_terms=["president", "executive branch", "code structure"], + temperature=0.2, + max_tokens=260, + status_callback=emit_status, + ) + turns.append(_format_turn("assistant", "Didactopus Evaluator", evaluator_feedback)) + + mentor_next_step = _generate_checked( + provider, + "Given the evaluator feedback, tell the learner what to do next before moving on to channel capacity. " + "Use the course path to show what comes next.", + role="mentor", + system_prompt=mentor_system_prompt(), + required_terms=["channel capacity", "entropy", "probability", "next"], + forbidden_terms=["president", "executive branch", "code structure"], + temperature=0.2, + max_tokens=220, + status_callback=emit_status, + ) + turns.append(_format_turn("assistant", "Didactopus Mentor", mentor_next_step)) + + stage_specs = [ + { + "topic": "Channel Capacity", + "path_slice": path_titles[4:7] or path_titles, + "practice_anchor": "binary symmetric channel", + "required_terms": ["channel", "capacity", "entropy", "noise"], + }, + { + "topic": "Coding and Compression", + "path_slice": path_titles[5:9] or path_titles, + "practice_anchor": "compression and error-correcting code", + "required_terms": ["coding", "compression", "redundancy", "error"], + }, + { + "topic": "Thermodynamic Entropy and Synthesis", + "path_slice": path_titles[8:] or path_titles, + "practice_anchor": "thermodynamic entropy", + "required_terms": ["thermodynamic", "entropy", "information", "physical"], + }, + ] + + for stage in stage_specs: + mentor_stage = _generate_checked( + provider, + f"The learner is continuing through the MIT OCW Information and Entropy course. " + f"Bridge from the previous work into {stage['topic']}. " + f"Reference this path segment: {stage['path_slice']}. " + "Explain what the learner should focus on before attempting a problem.", + role="mentor", + system_prompt=mentor_system_prompt(), + required_terms=stage["required_terms"], + forbidden_terms=["president", "executive branch", "code structure"], + temperature=0.2, + max_tokens=260, + status_callback=emit_status, + ) + turns.append(_format_turn("assistant", "Didactopus Mentor", mentor_stage)) + + learner_stage = _generate_checked( + provider, + f"Respond as the learner after studying {stage['topic']}. " + f"Summarize what now makes sense and one remaining difficulty about {stage['practice_anchor']}.", + role="learner", + system_prompt=learner_system_prompt(), + required_terms=stage["required_terms"], + forbidden_terms=["president", "executive branch", "code structure"], + temperature=0.4, + max_tokens=220, + status_callback=emit_status, + ) + turns.append(_format_turn("assistant", "AI Learner", learner_stage)) + + practice_stage = _generate_checked( + provider, + f"Create one short reasoning task about {stage['practice_anchor']} for the learner. 
" + "Keep it course-relevant and do not provide the full solution.", + role="practice", + system_prompt=practice_system_prompt(), + required_terms=stage["required_terms"], + forbidden_terms=["president", "executive branch", "code structure"], + temperature=0.3, + max_tokens=220, + status_callback=emit_status, + ) + turns.append(_format_turn("assistant", "Didactopus Practice Designer", practice_stage)) + + evaluator_stage = _generate_checked( + provider, + f"Give short evaluator feedback on this learner reflection in the context of {stage['topic']}: " + f"{learner_stage}\nTask context: {practice_stage}", + role="evaluator", + system_prompt=evaluator_system_prompt(), + required_terms=["correctness", "explanation", *stage["required_terms"][:2]], + forbidden_terms=["president", "executive branch", "code structure"], + temperature=0.2, + max_tokens=220, + status_callback=emit_status, + ) + turns.append(_format_turn("assistant", "Didactopus Evaluator", evaluator_stage)) + + return { + "provider": config.model_provider.provider, + "skill": context.skill_name, + "course": context.pack.get("display_name", context.pack.get("name", "")), + "curriculum_path_titles": path_titles, + "role_fallbacks": role_fallbacks, + "status_updates": status_updates, + "transcript": turns, + } + + +def write_transcript_artifacts(payload: dict, out_dir: str | Path) -> dict[str, str]: + out_dir = Path(out_dir) + out_dir.mkdir(parents=True, exist_ok=True) + json_path = out_dir / "rolemesh_transcript.json" + md_path = out_dir / "rolemesh_transcript.md" + json_path.write_text(json.dumps(payload, indent=2), encoding="utf-8") + + lines = [ + "# Local LLM Transcript: MIT OCW Information and Entropy", + "", + f"- Provider: `{payload['provider']}`", + f"- Skill: `{payload['skill']}`", + f"- Course: `{payload['course']}`", + "", + ] + if payload.get("status_updates"): + lines.append("## Pending Status Examples") + for update in payload["status_updates"][:8]: + lines.append(f"- {update}") + lines.append("") + for turn in payload["transcript"]: + lines.append(f"## {turn['speaker']}") + lines.append(turn["content"]) + lines.append("") + md_path.write_text("\n".join(lines), encoding="utf-8") + return {"json": str(json_path), "md": str(md_path)} + + +def run_ocw_rolemesh_transcript_demo(config_path: str | Path, skill_dir: str | Path, out_dir: str | Path) -> dict: + payload = build_ocw_rolemesh_transcript(config_path, skill_dir) + outputs = write_transcript_artifacts(payload, out_dir) + payload["artifacts"] = outputs + return payload + + +def main() -> None: + import argparse + + root = Path(__file__).resolve().parents[2] + parser = argparse.ArgumentParser(description="Generate a transcript of an AI learner using a local LLM path to approach the MIT OCW Information and Entropy course.") + parser.add_argument("--config", default=str(root / "configs" / "config.rolemesh.example.yaml")) + parser.add_argument("--skill-dir", default=str(root / "skills" / "ocw-information-entropy-agent")) + parser.add_argument("--out-dir", default=str(root / "examples" / "ocw-information-entropy-rolemesh-transcript")) + args = parser.parse_args() + payload = run_ocw_rolemesh_transcript_demo(args.config, args.skill_dir, args.out_dir) + print(json.dumps(payload, indent=2)) + + +if __name__ == "__main__": + main() diff --git a/src/didactopus/pack_emitter.py b/src/didactopus/pack_emitter.py index 9afdc84..37dcc6a 100644 --- a/src/didactopus/pack_emitter.py +++ b/src/didactopus/pack_emitter.py @@ -6,6 +6,64 @@ import yaml from .course_schema import 
NormalizedCourse, ConceptCandidate, DraftPack +def build_source_corpus(course: NormalizedCourse) -> dict: + fragments = [] + for module in course.modules: + for lesson in module.lessons: + body = lesson.body.strip() + if body: + fragments.append( + { + "fragment_id": f"{module.title}::{lesson.title}::body".lower().replace(" ", "-"), + "kind": "lesson_body", + "module_title": module.title, + "lesson_title": lesson.title, + "text": body, + "source_refs": list(lesson.source_refs), + "objectives": list(lesson.objectives), + "exercises": list(lesson.exercises), + "key_terms": list(lesson.key_terms), + } + ) + for idx, objective in enumerate(lesson.objectives, start=1): + fragments.append( + { + "fragment_id": f"{module.title}::{lesson.title}::objective-{idx}".lower().replace(" ", "-"), + "kind": "objective", + "module_title": module.title, + "lesson_title": lesson.title, + "text": objective, + "source_refs": list(lesson.source_refs), + } + ) + for idx, exercise in enumerate(lesson.exercises, start=1): + fragments.append( + { + "fragment_id": f"{module.title}::{lesson.title}::exercise-{idx}".lower().replace(" ", "-"), + "kind": "exercise", + "module_title": module.title, + "lesson_title": lesson.title, + "text": exercise, + "source_refs": list(lesson.source_refs), + } + ) + + return { + "course_title": course.title, + "rights_note": course.rights_note, + "sources": [ + { + "source_path": src.source_path, + "source_type": src.source_type, + "title": src.title, + "metadata": getattr(src, "metadata", {}), + } + for src in course.source_records + ], + "fragments": fragments, + } + + def build_draft_pack( course: NormalizedCourse, concepts: list[ConceptCandidate], @@ -29,6 +87,7 @@ def build_draft_pack( "overrides": [], "profile_templates": {}, "cross_pack_links": [], + "supporting_artifacts": ["source_corpus.json"], } concepts_yaml = { "concepts": [ @@ -100,3 +159,9 @@ def write_draft_pack(pack: DraftPack, outdir: str | Path) -> None: conflict_lines = ["# Conflict Report", ""] + [f"- {flag}" for flag in pack.conflicts] if pack.conflicts else ["# Conflict Report", "", "- none"] (out / "conflict_report.md").write_text("\n".join(conflict_lines), encoding="utf-8") (out / "license_attribution.json").write_text(json.dumps(pack.attribution, indent=2), encoding="utf-8") + + +def write_source_corpus(course: NormalizedCourse, outdir: str | Path) -> None: + out = Path(outdir) + out.mkdir(parents=True, exist_ok=True) + (out / "source_corpus.json").write_text(json.dumps(build_source_corpus(course), indent=2), encoding="utf-8") diff --git a/src/didactopus/practice.py b/src/didactopus/practice.py index 4d58d69..50855c9 100644 --- a/src/didactopus/practice.py +++ b/src/didactopus/practice.py @@ -1,4 +1,5 @@ from .model_provider import ModelProvider +from .role_prompts import practice_system_prompt def generate_practice_task(provider: ModelProvider, concept: str, weak_dimensions: list[str] | None = None) -> str: @@ -6,5 +7,7 @@ def generate_practice_task(provider: ModelProvider, concept: str, weak_dimension if weak_dimensions: weak_text = f" Target the weak dimensions: {', '.join(weak_dimensions)}." 
return provider.generate( - f"Generate one reasoning-heavy practice task for '{concept}'.{weak_text}" + f"Generate one reasoning-heavy practice task for '{concept}'.{weak_text}", + role="practice", + system_prompt=practice_system_prompt(), ).text diff --git a/src/didactopus/project_advisor.py b/src/didactopus/project_advisor.py index 4d79e43..60a8b07 100644 --- a/src/didactopus/project_advisor.py +++ b/src/didactopus/project_advisor.py @@ -1,7 +1,10 @@ from .model_provider import ModelProvider +from .role_prompts import project_advisor_system_prompt def suggest_capstone(provider: ModelProvider, domain: str) -> str: return provider.generate( - f"Suggest one realistic capstone project for mastery in {domain}." + f"Suggest one realistic capstone project for mastery in {domain}.", + role="project_advisor", + system_prompt=project_advisor_system_prompt(), ).text diff --git a/src/didactopus/role_prompts.py b/src/didactopus/role_prompts.py new file mode 100644 index 0000000..e0da37a --- /dev/null +++ b/src/didactopus/role_prompts.py @@ -0,0 +1,41 @@ +from __future__ import annotations + + +def mentor_system_prompt() -> str: + return ( + "You are Didactopus in mentor mode. Help the learner think through the topic without doing the work for them. " + "Prefer Socratic questions, prerequisite reminders, and hints over finished solutions. " + "When responding to a learner attempt or evaluator note, acknowledge what the learner already did correctly before naming gaps. " + "Do not claim a caveat, limitation, or nuance is missing if the learner already stated one; instead say how to sharpen or extend it." + ) + + +def practice_system_prompt() -> str: + return ( + "You are Didactopus in practice-design mode. Generate short, reasoning-heavy tasks that force the learner " + "to explain, compare, or derive ideas rather than copy answers." + ) + + +def learner_system_prompt() -> str: + return ( + "You are an earnest AI learner using Didactopus to study a topic. Think aloud briefly, attempt the task yourself, " + "and avoid asking for the final answer to be given to you outright." + ) + + +def project_advisor_system_prompt() -> str: + return ( + "You are Didactopus in capstone-advisor mode. Suggest realistic project ideas that require synthesis and " + "independent execution, and avoid proposing tasks that can be completed by rote prompting alone." + ) + + +def evaluator_system_prompt() -> str: + return ( + "You are Didactopus in evaluator mode. Assess clarity, reasoning, and limitations explicitly. " + "Point out weak assumptions and missing justification instead of giving the polished final answer. " + "Before saying something is missing, first verify whether the learner already included it. " + "If the learner stated a caveat, limitation, or nuance, quote or paraphrase that part and evaluate its quality rather than pretending it is absent. " + "Do not invent omissions that are contradicted by the learner's actual text." 
+ ) diff --git a/src/didactopus/rolemesh_demo.py b/src/didactopus/rolemesh_demo.py new file mode 100644 index 0000000..c895776 --- /dev/null +++ b/src/didactopus/rolemesh_demo.py @@ -0,0 +1,54 @@ +from __future__ import annotations + +import json +from pathlib import Path + +from .config import load_config +from .mentor import generate_socratic_prompt +from .model_provider import ModelProvider +from .practice import generate_practice_task +from .project_advisor import suggest_capstone +from .role_prompts import evaluator_system_prompt + + +def run_rolemesh_demo(config_path: str | Path, out_path: str | Path | None = None) -> dict: + config = load_config(config_path) + provider = ModelProvider(config.model_provider) + + payload = { + "provider": config.model_provider.provider, + "mentor_prompt": generate_socratic_prompt(provider, "channel capacity", ["explanation"]), + "practice_task": generate_practice_task(provider, "Shannon entropy", ["transfer"]), + "capstone": suggest_capstone(provider, "information theory"), + "evaluation_instruction": provider.generate( + "Evaluate a learner explanation of thermodynamic entropy versus Shannon entropy.", + role="evaluator", + system_prompt=evaluator_system_prompt(), + ).text, + } + + if out_path is not None: + Path(out_path).write_text(json.dumps(payload, indent=2), encoding="utf-8") + return payload + + +def main() -> None: + import argparse + + root = Path(__file__).resolve().parents[2] + parser = argparse.ArgumentParser(description="Run a Didactopus demo against a local RoleMesh-compatible model provider.") + parser.add_argument( + "--config", + default=str(root / "configs" / "config.rolemesh.example.yaml"), + ) + parser.add_argument( + "--out", + default=str(root / "examples" / "rolemesh_demo.json"), + ) + args = parser.parse_args() + payload = run_rolemesh_demo(args.config, args.out) + print(json.dumps(payload, indent=2)) + + +if __name__ == "__main__": + main() diff --git a/src/didactopus/topic_ingest.py b/src/didactopus/topic_ingest.py index b4cb94d..7823d39 100644 --- a/src/didactopus/topic_ingest.py +++ b/src/didactopus/topic_ingest.py @@ -4,6 +4,47 @@ import re from collections import defaultdict from .course_schema import NormalizedDocument, NormalizedCourse, Module, Lesson, TopicBundle, ConceptCandidate +GENERIC_TERM_STOPWORDS = { + "attribution", + "build", + "careful", + "compare", + "comparison", + "compute", + "course", + "decide", + "describe", + "didactopus", + "early", + "exercise", + "explain", + "home", + "identify", + "independent", + "later", + "list", + "notes", + "objective", + "open", + "opencourseware", + "produce", + "programming", + "reference", + "source", + "spring", + "state", + "structure", + "summarize", + "syllabus", + "synthesis", + "synthesize", + "texts", + "these", + "ultimate", + "unit", + "work", + "write", +} def slugify(text: str) -> str: cleaned = re.sub(r"[^a-zA-Z0-9]+", "-", text.strip().lower()).strip("-") @@ -15,6 +56,9 @@ def extract_key_terms(text: str, min_term_length: int = 4, max_terms: int = 8) - seen = set() out = [] for term in candidates: + lower = term.lower() + if lower in GENERIC_TERM_STOPWORDS: + continue if term not in seen: seen.add(term) out.append(term) @@ -23,6 +67,16 @@ def extract_key_terms(text: str, min_term_length: int = 4, max_terms: int = 8) - return out +def _parse_signal_line(line: str) -> tuple[str | None, str]: + stripped = line.strip() + if stripped.startswith(("-", "*", "+")): + stripped = stripped[1:].strip() + lowered = stripped.lower() + if 
lowered.startswith("objective:"): + return "objective", stripped.split(":", 1)[1].strip() + if lowered.startswith("exercise:"): + return "exercise", stripped.split(":", 1)[1].strip() + return None, stripped def document_to_course(doc: NormalizedDocument, course_title: str) -> NormalizedCourse: # Conservative mapping: each section becomes a lesson; all lessons go into one module. lessons = [] @@ -34,18 +88,18 @@ def document_to_course(doc: NormalizedDocument, course_title: str) -> Normalized objectives = [] exercises = [] for line in lines: - low = line.lower().strip() - if low.startswith("objective:"): - objectives.append(line.split(":", 1)[1].strip()) - if low.startswith("exercise:"): - exercises.append(line.split(":", 1)[1].strip()) + kind, value = _parse_signal_line(line) + if kind == "objective": + objectives.append(value) + if kind == "exercise": + exercises.append(value) lessons.append( Lesson( title=section.heading.strip() or "Untitled Lesson", body=body, objectives=objectives, exercises=exercises, - key_terms=extract_key_terms(section.heading + "\n" + body), + key_terms=extract_key_terms(body), source_refs=[doc.source_path], ) ) @@ -113,6 +167,8 @@ def extract_concept_candidates(course: NormalizedCourse) -> list[ConceptCandidat tid = slugify(term) if tid in seen_ids: continue + if tid in {slugify(part) for part in lesson.title.split()}: + continue seen_ids.add(tid) concepts.append( ConceptCandidate( diff --git a/tests/test_config.py b/tests/test_config.py index 85c8279..af30077 100644 --- a/tests/test_config.py +++ b/tests/test_config.py @@ -6,3 +6,11 @@ def test_load_example_config() -> None: config = load_config(Path("configs/config.example.yaml")) assert config.platform.dimension_thresholds["transfer"] == 0.7 assert config.platform.confidence_threshold == 0.8 + assert config.model_provider.provider == "stub" + + +def test_load_rolemesh_config() -> None: + config = load_config(Path("configs/config.rolemesh.example.yaml")) + assert config.model_provider.provider == "rolemesh" + assert config.model_provider.rolemesh.role_to_model["mentor"] == "planner" + assert config.model_provider.rolemesh.role_to_model["learner"] == "writer" diff --git a/tests/test_model_provider.py b/tests/test_model_provider.py new file mode 100644 index 0000000..8e5a057 --- /dev/null +++ b/tests/test_model_provider.py @@ -0,0 +1,82 @@ +from didactopus.config import ModelProviderConfig +from didactopus.model_provider import ModelProvider +from didactopus.role_prompts import evaluator_system_prompt, mentor_system_prompt + + +def test_stub_provider_includes_role_marker() -> None: + provider = ModelProvider(ModelProviderConfig()) + response = provider.generate("Explain entropy simply.", role="mentor") + assert response.provider == "stub" + assert "[mentor]" in response.text + + +def test_rolemesh_provider_uses_role_mapping() -> None: + config = ModelProviderConfig.model_validate( + { + "provider": "rolemesh", + "rolemesh": { + "base_url": "http://127.0.0.1:8000", + "api_key": "demo", + "default_model": "planner", + "role_to_model": {"mentor": "planner", "practice": "writer"}, + }, + } + ) + provider = ModelProvider(config) + + def fake_chat(payload: dict) -> dict: + assert payload["model"] == "writer" + assert payload["messages"][0]["role"] == "system" + return {"choices": [{"message": {"content": "Practice task response"}}]} + + provider._rolemesh_chat_completion = fake_chat # type: ignore[method-assign] + response = provider.generate( + "Generate a practice task.", + role="practice", + system_prompt="System 
prompt", + ) + assert response.provider == "rolemesh" + assert response.model_name == "writer" + assert response.text == "Practice task response" + + +def test_rolemesh_provider_emits_pending_notice() -> None: + config = ModelProviderConfig.model_validate( + { + "provider": "rolemesh", + "rolemesh": { + "base_url": "http://127.0.0.1:8000", + "api_key": "demo", + "default_model": "planner", + "role_to_model": {"evaluator": "reviewer"}, + }, + } + ) + provider = ModelProvider(config) + seen: list[str] = [] + + def fake_chat(payload: dict) -> dict: + return {"choices": [{"message": {"content": "Evaluation response"}}]} + + provider._rolemesh_chat_completion = fake_chat # type: ignore[method-assign] + response = provider.generate( + "Evaluate a learner answer.", + role="evaluator", + status_callback=seen.append, + ) + + assert response.text == "Evaluation response" + assert seen == ["Didactopus is evaluating the work before replying. Model: reviewer."] + + +def test_evaluator_prompt_requires_checking_existing_caveats() -> None: + prompt = evaluator_system_prompt().lower() + assert "before saying something is missing" in prompt + assert "quote or paraphrase" in prompt + assert "do not invent omissions" in prompt + + +def test_mentor_prompt_requires_acknowledging_existing_caveats() -> None: + prompt = mentor_system_prompt().lower() + assert "acknowledge what the learner already did correctly" in prompt + assert "do not claim a caveat" in prompt diff --git a/tests/test_ocw_information_entropy_demo.py b/tests/test_ocw_information_entropy_demo.py index a8407b7..14e0da9 100644 --- a/tests/test_ocw_information_entropy_demo.py +++ b/tests/test_ocw_information_entropy_demo.py @@ -1,4 +1,5 @@ from pathlib import Path +import json from didactopus.ocw_information_entropy_demo import run_ocw_information_entropy_demo @@ -6,7 +7,7 @@ from didactopus.ocw_information_entropy_demo import run_ocw_information_entropy_ def test_ocw_information_entropy_demo_generates_pack_and_skill(tmp_path: Path) -> None: root = Path(__file__).resolve().parents[1] summary = run_ocw_information_entropy_demo( - course_source=root / "examples" / "ocw-information-entropy" / "6-050j-information-and-entropy.md", + course_source=root / "examples" / "ocw-information-entropy" / "course", source_inventory=root / "examples" / "ocw-information-entropy" / "sources.yaml", pack_dir=tmp_path / "pack", run_dir=tmp_path / "run", @@ -15,7 +16,38 @@ def test_ocw_information_entropy_demo_generates_pack_and_skill(tmp_path: Path) - assert (tmp_path / "pack" / "pack.yaml").exists() assert (tmp_path / "pack" / "pack_compliance_manifest.json").exists() + assert (tmp_path / "pack" / "source_corpus.json").exists() assert (tmp_path / "run" / "capability_profile.json").exists() assert (tmp_path / "skill" / "SKILL.md").exists() assert summary["target_concept"].endswith("thermodynamics-and-entropy") assert summary["mastered_concepts"] + assert summary["source_document_count"] >= 1 + assert summary["source_fragment_count"] >= 1 + + +def test_ocw_demo_accepts_directory_tree_sources(tmp_path: Path) -> None: + source_dir = tmp_path / "course" + source_dir.mkdir() + (source_dir / "unit1.md").write_text( + "# Course\n\n## Unit 1\n### Counting and Probability\n- Objective: Explain counting arguments.\nBody text.", + encoding="utf-8", + ) + (source_dir / "unit2.txt").write_text( + "## Unit 2\n### Shannon Entropy\nObjective: Relate uncertainty and entropy.\nExercise: Compare two distributions.", + encoding="utf-8", + ) + sources = tmp_path / "sources.yaml" + 
sources.write_text("sources: []\n", encoding="utf-8") + + summary = run_ocw_information_entropy_demo( + course_source=source_dir, + source_inventory=sources, + pack_dir=tmp_path / "pack", + run_dir=tmp_path / "run", + skill_dir=tmp_path / "skill", + ) + + corpus = json.loads((tmp_path / "pack" / "source_corpus.json").read_text(encoding="utf-8")) + assert summary["source_document_count"] == 2 + assert len(corpus["sources"]) == 2 + assert any(fragment["lesson_title"] == "Shannon Entropy" for fragment in corpus["fragments"]) diff --git a/tests/test_ocw_rolemesh_transcript_demo.py b/tests/test_ocw_rolemesh_transcript_demo.py new file mode 100644 index 0000000..83a6429 --- /dev/null +++ b/tests/test_ocw_rolemesh_transcript_demo.py @@ -0,0 +1,39 @@ +from pathlib import Path + +from didactopus.ocw_rolemesh_transcript_demo import _looks_truncated, run_ocw_rolemesh_transcript_demo + + +def test_looks_truncated_detects_prose_cutoff_before_numbered_list() -> None: + text = ( + "Suppose we have a binary symmetric channel with crossover\n\n" + "1. Estimate the error probability.\n" + "2. Relate it to capacity." + ) + assert _looks_truncated(text) is True + + +def test_looks_truncated_detects_common_cutoff_phrase() -> None: + assert _looks_truncated("Furthermore") is True + assert _looks_truncated("Compare the entropy of one roll with the") is True + + +def test_ocw_rolemesh_transcript_demo_writes_artifacts(tmp_path: Path) -> None: + root = Path(__file__).resolve().parents[1] + payload = run_ocw_rolemesh_transcript_demo( + root / "configs" / "config.example.yaml", + root / "skills" / "ocw-information-entropy-agent", + tmp_path, + ) + + assert payload["provider"] == "stub" + assert len(payload["transcript"]) >= 16 + assert len(payload["curriculum_path_titles"]) >= 8 + assert payload["role_fallbacks"] == {} + assert payload["status_updates"] == [] + assert any(turn["speaker"] == "Didactopus Evaluator" for turn in payload["transcript"]) + assert any("channel" in turn["content"].lower() for turn in payload["transcript"]) + assert any("thermodynamic" in turn["content"].lower() for turn in payload["transcript"]) + assert all(not _looks_truncated(turn["content"]) for turn in payload["transcript"]) + assert (tmp_path / "rolemesh_transcript.json").exists() + assert (tmp_path / "rolemesh_transcript.md").exists() + assert "Pending Status Examples" not in (tmp_path / "rolemesh_transcript.md").read_text(encoding="utf-8") diff --git a/tests/test_pack_emitter_source_corpus.py b/tests/test_pack_emitter_source_corpus.py new file mode 100644 index 0000000..3181dcc --- /dev/null +++ b/tests/test_pack_emitter_source_corpus.py @@ -0,0 +1,31 @@ +from pathlib import Path +import json + +from didactopus.course_ingest import parse_markdown_course +from didactopus.pack_emitter import build_source_corpus, write_source_corpus + + +SAMPLE = """ +# Sample Course + +## Module 1 +### Lesson A +- Objective: Explain Topic A. +- Exercise: Solve Task A. +Topic A body. +""" + + +def test_build_source_corpus_preserves_lesson_text_and_signals(tmp_path: Path) -> None: + course = parse_markdown_course(SAMPLE, "Sample Course") + corpus = build_source_corpus(course) + + assert corpus["course_title"] == "Sample Course" + assert corpus["sources"] + assert any(fragment["kind"] == "lesson_body" and "Topic A body." in fragment["text"] for fragment in corpus["fragments"]) + assert any(fragment["kind"] == "objective" and "Explain Topic A." 
in fragment["text"] for fragment in corpus["fragments"]) + assert any(fragment["kind"] == "exercise" and "Solve Task A." in fragment["text"] for fragment in corpus["fragments"]) + + write_source_corpus(course, tmp_path) + written = json.loads((tmp_path / "source_corpus.json").read_text(encoding="utf-8")) + assert written["fragments"] diff --git a/tests/test_rolemesh_demo.py b/tests/test_rolemesh_demo.py new file mode 100644 index 0000000..b1bc792 --- /dev/null +++ b/tests/test_rolemesh_demo.py @@ -0,0 +1,15 @@ +from pathlib import Path + +from didactopus.rolemesh_demo import run_rolemesh_demo + + +def test_run_rolemesh_demo_writes_output(tmp_path: Path) -> None: + root = Path(__file__).resolve().parents[1] + payload = run_rolemesh_demo( + root / "configs" / "config.example.yaml", + tmp_path / "rolemesh_demo.json", + ) + + assert (tmp_path / "rolemesh_demo.json").exists() + assert payload["provider"] == "stub" + assert "mentor" in payload["mentor_prompt"] diff --git a/tests/test_topic_ingest.py b/tests/test_topic_ingest.py index 67fa154..f43127e 100644 --- a/tests/test_topic_ingest.py +++ b/tests/test_topic_ingest.py @@ -32,3 +32,31 @@ def test_document_to_course_skips_empty_sections(tmp_path: Path) -> None: doc = adapt_document(a) course = document_to_course(doc, "Topic") assert [lesson.title for lesson in course.modules[0].lessons] == ["Filled"] + + +def test_document_to_course_parses_bulleted_objectives_and_exercises(tmp_path: Path) -> None: + a = tmp_path / "a.md" + a.write_text( + "# T\n\n## M\n### Shannon Entropy\n- Objective: Explain uncertainty.\n- Exercise: Compute entropy.\nBody.", + encoding="utf-8", + ) + doc = adapt_document(a) + course = document_to_course(doc, "Topic") + lesson = course.modules[0].lessons[0] + assert lesson.objectives == ["Explain uncertainty."] + assert lesson.exercises == ["Compute entropy."] + + +def test_extract_concepts_retains_lessons_but_filters_generic_terms(tmp_path: Path) -> None: + a = tmp_path / "a.md" + a.write_text( + "# T\n\n## M\n### MIT OCW 6.050J Information and Entropy: Syllabus\n- Objective: Explain the course.\nBody.\n\n### Channel Capacity\n- Objective: Explain noisy channels.\n- Exercise: State a capacity limit.\nChannel Capacity links reliable communication to noise and coding.", + encoding="utf-8", + ) + doc = adapt_document(a) + course = document_to_course(doc, "Topic") + concepts = extract_concept_candidates(course) + titles = {concept.title for concept in concepts} + assert "MIT OCW 6.050J Information and Entropy: Syllabus" in titles + assert "Explain" not in titles + assert "Channel Capacity" in titles