Revise the license acknowledgements process and its artifacts.

This commit is contained in:
welsberr 2026-03-15 10:04:30 -04:00
parent 3b99ffb179
commit 5969a932d3
27 changed files with 913 additions and 5 deletions

View File

@ -163,6 +163,7 @@ generates:
- learner outputs in `examples/ocw-information-entropy-run/`
- a repo-local skill bundle in `skills/ocw-information-entropy-agent/`
- an agentic skill-usage demo in `examples/ocw-information-entropy-skill-demo/`
- compliance artifacts including `pack_compliance_manifest.json` and `source_inventory.yaml`
## What visualizations exist today?
@ -218,4 +219,3 @@ via Codex to confirm functionality, fix bugs, and update
documentation. Elsberry provided goals, direction, operational
principles, and orchestration, and generative AI has provided pretty
much the rest.

View File

@ -36,6 +36,12 @@ A derived pack should carry:
- share_alike_required
- noncommercial_only
The recommended route in this repository is:
1. maintain a `sources.yaml` inventory for the source set
2. generate `pack_compliance_manifest.json`
3. keep `license_attribution.json` for human-facing attribution details
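The manifest in step 2 can be derived mechanically from the step 1 inventory. The repository's actual implementation lives in `course_ingestion_compliance.build_pack_compliance_manifest`; the sketch below only illustrates the idea, with an assumed helper name and an assumed flag-inference rule keyed off the `-NC`/`-SA` segments of the Creative Commons license identifier:

```python
def build_manifest(pack_id: str, display_name: str, sources: list[dict]) -> dict:
    """Illustrative sketch: derive a pack_compliance_manifest.json payload
    from sources.yaml entries. The flag-inference rule (substring match on
    the CC license identifier) is an assumption, not the real implementation.
    """
    flags = set()
    for src in sources:
        license_id = src.get("license_id", "")
        if "-SA" in license_id:
            flags.add("share-alike")
        if "-NC" in license_id:
            flags.add("noncommercial")
    return {
        "pack_id": pack_id,
        "display_name": display_name,
        "derived_from_sources": [src["source_id"] for src in sources],
        "attribution_required": True,
        "share_alike_required": "share-alike" in flags,
        "noncommercial_only": "noncommercial" in flags,
        "restrictive_flags": sorted(flags),
    }
```

For a CC BY-NC-SA source this yields both restrictive flags, matching the shape of the generated `pack_compliance_manifest.json` in this commit.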
## MIT OCW-specific pattern
For MIT OpenCourseWare-derived packs, treat the course material as licensed content while separately recording:
@ -43,3 +49,5 @@ For MIT OpenCourseWare-derived packs, treat the course material as licensed cont
- image/video exceptions
- linked-content exceptions
- any asset not safely covered by the course-level reuse assumption
The MIT OCW Information and Entropy demo in this repository follows that pattern and can be used as the reference implementation.

View File

@ -0,0 +1,93 @@
# Working With Other MIT OCW Courses
This is the recommended pattern for bringing more MIT OpenCourseWare courses into Didactopus.
## Goal
Use MIT OCW as a structured source for learning, while preserving:
- attribution
- license references
- adaptation status
- noncommercial/share-alike flags
- a place to record excluded third-party content when it appears
## Minimal workflow
1. Pick a course and collect the specific pages you are actually using.
2. Create a local derived source file for reproducible ingestion.
3. Create a `sources.yaml` inventory beside that source file.
4. Run the ingestion/demo pipeline and emit a `pack_compliance_manifest.json`.
5. Review the generated pack before treating it as reusable teaching material.
## Recommended directory shape
For a new MIT OCW-derived example, mirror the existing pattern:
```text
examples/<course-slug>/
  course-source.md
  sources.yaml
```
The corresponding generated outputs should include:
```text
domain-packs/<course-slug>/
  license_attribution.json
  pack_compliance_manifest.json
  source_inventory.yaml
```
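The hand-maintained half of that layout can be scaffolded with the standard library alone. A minimal sketch; the helper name is illustrative, and the generated `domain-packs/` outputs come from the pipeline, not from this helper:

```python
from pathlib import Path

def scaffold_course_example(repo_root: str, course_slug: str) -> Path:
    """Create examples/<course-slug>/ with empty placeholders for the
    derived source file and the sources.yaml inventory."""
    example_dir = Path(repo_root) / "examples" / course_slug
    example_dir.mkdir(parents=True, exist_ok=True)
    (example_dir / "course-source.md").touch()
    (example_dir / "sources.yaml").touch()
    return example_dir
```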
## What goes in `sources.yaml`
Record each course page or resource page that materially informed the generated pack.
At minimum include:
- `source_id`
- `title`
- `url`
- `publisher`
- `creator`
- `license_id`
- `license_url`
- `retrieved_at`
- `adapted`
- `attribution_text`
- `excluded_from_upstream_license`
- `exclusion_notes`
Use `examples/ocw-information-entropy/sources.yaml` as the concrete model.
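For reference, one entry from that model inventory looks like this (the values are course-specific):

```yaml
sources:
  - source_id: mit-ocw-6-050j-course-home
    title: MIT OpenCourseWare 6.050J Information and Entropy course home
    url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/
    publisher: Massachusetts Institute of Technology
    creator: MIT OpenCourseWare
    license_id: CC BY-NC-SA 4.0
    license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
    retrieved_at: "2026-03-14"
    adapted: true
    attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
    excluded_from_upstream_license: false
    exclusion_notes: ""
```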
## When to add excluded-source records
Add explicit excluded records when:
- the course page points to third-party figures or readings
- the page itself warns that a particular asset is excluded from the main course license
- you want the record preserved even though you do not reuse the asset
Excluded records are the mechanism for acknowledging sources that require special handling, even when their content is never reused.
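An excluded record reuses the same inventory fields. A sketch (every value below is an illustrative placeholder, not a real course asset):

```yaml
  - source_id: example-course-third-party-figure  # placeholder id
    title: Third-party figure referenced from a unit page
    url: https://example.com/placeholder-figure
    publisher: Example Publisher
    creator: Example Author
    license_id: All Rights Reserved
    license_url: ""
    retrieved_at: "2026-03-14"
    adapted: false
    attribution_text: ""
    excluded_from_upstream_license: true
    exclusion_notes: Not covered by the course-level CC license; recorded but not reused in the generated pack.
```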
## Practical advice for course selection
Good first OCW candidates:
- courses with a strong week-by-week or unit-by-unit structure
- courses with stable textual descriptions, readings, or assignments
- courses where you can summarize the progression into a single local source file
Harder candidates:
- courses whose value is mostly in embedded media
- courses with many third-party handouts or linked readings
- courses with weak textual structure
## Current repo reference
The MIT OCW Information and Entropy demo is the reference implementation of this pattern:
- source file: `examples/ocw-information-entropy/6-050j-information-and-entropy.md`
- source inventory: `examples/ocw-information-entropy/sources.yaml`
- generated pack: `domain-packs/mit-ocw-information-entropy/`

View File

@ -7,7 +7,9 @@ MIT OpenCourseWare material is a good fit for Didactopus demos, but it needs exp
The MIT OCW Information and Entropy demo stores:
- a local derived source file in `examples/ocw-information-entropy/`
- a `sources.yaml` source inventory beside that file
- attribution and rights notes in the generated pack
- a generated `pack_compliance_manifest.json` in the generated pack
- generated learner outputs in `examples/ocw-information-entropy-run/`
- a repo-local skill bundle in `skills/ocw-information-entropy-agent/`
@ -27,6 +29,9 @@ That means Didactopus should:
When building from MIT OCW sources:
- record the course page and any unit/resource pages used
- keep those records in a per-course `sources.yaml` inventory
- separate core MIT OCW material from excluded third-party items if they appear
- keep generated pack content clearly marked as adapted/derived
- include attribution artifacts with the emitted pack
- include attribution and compliance artifacts with the emitted pack
For the full workflow, see `docs/mit-ocw-course-guide.md`.

View File

@ -0,0 +1,420 @@
concepts:
- id: mit-ocw-6-050j-information-and-entropy
title: MIT OCW 6.050J Information and Entropy
description: 'Source: MIT OpenCourseWare 6.050J Information and Entropy, Spring
2008.
Attribution: adapted from the OCW course overview, unit sequence, and assigned
textbook references.'
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: information
title: Information
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information
and Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: entropy
title: Entropy
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information
and Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: source
title: Source
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information
and Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: opencourseware
title: OpenCourseWare
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information
and Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: spring
title: Spring
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information
and Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: attribution
title: Attribution
description: Candidate concept extracted from lesson 'MIT OCW 6.050J Information
and Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: counting-and-probability
title: Counting and Probability
description: '- Objective: Explain how counting arguments, probability spaces, and
random variables support later information-theory results.
- Exercise: Derive a simple counting argument for binary strings and compute an
event probability.
This lesson i'
prerequisites:
- mit-ocw-6-050j-information-and-entropy
mastery_signals: []
mastery_profile: {}
- id: counting
title: Counting
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: probability
title: Probability
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: objective
title: Objective
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: explain
title: Explain
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: exercise
title: Exercise
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: derive
title: Derive
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: this
title: This
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: random
title: Random
description: Candidate concept extracted from lesson 'Counting and Probability'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: shannon-entropy
title: Shannon Entropy
description: '- Objective: Explain Shannon Entropy as a measure of uncertainty and
compare high-entropy and low-entropy sources.
- Exercise: Compute the entropy of a Bernoulli source and interpret the result.
This lesson centers Shannon Entropy, Surprise'
prerequisites:
- counting-and-probability
mastery_signals: []
mastery_profile: {}
- id: shannon
title: Shannon
description: Candidate concept extracted from lesson 'Shannon Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: compute
title: Compute
description: Candidate concept extracted from lesson 'Shannon Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: bernoulli
title: Bernoulli
description: Candidate concept extracted from lesson 'Shannon Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: mutual-information
title: Mutual Information
description: '- Objective: Explain Mutual Information and relate it to dependence
between signals.
- Exercise: Compare independent variables with dependent variables using mutual-information
reasoning.
This lesson introduces Mutual Information, Dependenc'
prerequisites:
- shannon-entropy
mastery_signals: []
mastery_profile: {}
- id: mutual
title: Mutual
description: Candidate concept extracted from lesson 'Mutual Information'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: compare
title: Compare
description: Candidate concept extracted from lesson 'Mutual Information'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: dependence
title: Dependence
description: Candidate concept extracted from lesson 'Mutual Information'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: data-compression
title: Data Compression
description: '- Objective: Explain lossless compression in terms of entropy and
typical structure.
- Exercise: Describe when compression succeeds and when it fails on already-random
data.
This lesson covers Data Compression, Redundancy, and Efficient Rep'
prerequisites:
- mutual-information
mastery_signals: []
mastery_profile: {}
- id: data
title: Data
description: Candidate concept extracted from lesson 'Data Compression'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: compression
title: Compression
description: Candidate concept extracted from lesson 'Data Compression'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: describe
title: Describe
description: Candidate concept extracted from lesson 'Data Compression'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: redundancy
title: Redundancy
description: Candidate concept extracted from lesson 'Data Compression'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: huffman-coding
title: Huffman Coding
description: '- Objective: Explain Huffman Coding and justify why shorter codewords
should track more likely symbols.
- Exercise: Build a Huffman code for a small source alphabet.
This lesson focuses on Huffman Coding, Prefix Codes, and Expected Length.'
prerequisites:
- data-compression
mastery_signals: []
mastery_profile: {}
- id: huffman
title: Huffman
description: Candidate concept extracted from lesson 'Huffman Coding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: coding
title: Coding
description: Candidate concept extracted from lesson 'Huffman Coding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: build
title: Build
description: Candidate concept extracted from lesson 'Huffman Coding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: prefix
title: Prefix
description: Candidate concept extracted from lesson 'Huffman Coding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: channel-capacity
title: Channel Capacity
description: '- Objective: Explain Channel Capacity as a limit on reliable communication
over noisy channels.
- Exercise: State why reliable transmission above capacity is impossible in the
long run.
This lesson develops Channel Capacity, Reliable Commun'
prerequisites:
- huffman-coding
mastery_signals: []
mastery_profile: {}
- id: channel
title: Channel
description: Candidate concept extracted from lesson 'Channel Capacity'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: capacity
title: Capacity
description: Candidate concept extracted from lesson 'Channel Capacity'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: state
title: State
description: Candidate concept extracted from lesson 'Channel Capacity'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: reliable
title: Reliable
description: Candidate concept extracted from lesson 'Channel Capacity'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: channel-coding
title: Channel Coding
description: '- Objective: Explain how Channel Coding adds structure that protects
messages against noise.
- Exercise: Contrast uncoded transmission with coded transmission on a noisy channel.
This lesson connects Channel Coding, Decoding, and Reliabilit'
prerequisites:
- channel-capacity
mastery_signals: []
mastery_profile: {}
- id: contrast
title: Contrast
description: Candidate concept extracted from lesson 'Channel Coding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: decoding
title: Decoding
description: Candidate concept extracted from lesson 'Channel Coding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: error-correcting-codes
title: Error Correcting Codes
description: '- Objective: Explain how Error Correcting Codes detect or correct
symbol corruption.
- Exercise: Describe a simple parity-style code and its limits.
This lesson covers Error Correcting Codes, Parity, and Syndrome-style reasoning.
The learne'
prerequisites:
- channel-coding
mastery_signals: []
mastery_profile: {}
- id: error
title: Error
description: Candidate concept extracted from lesson 'Error Correcting Codes'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: correcting
title: Correcting
description: Candidate concept extracted from lesson 'Error Correcting Codes'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: codes
title: Codes
description: Candidate concept extracted from lesson 'Error Correcting Codes'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: cryptography-and-information-hiding
title: Cryptography and Information Hiding
description: '- Objective: Explain the relationship between secrecy, information
leakage, and coded communication.
- Exercise: Compare a secure scheme with a weak one in terms of revealed information.
This lesson combines Cryptography, Information Leakag'
prerequisites:
- error-correcting-codes
mastery_signals: []
mastery_profile: {}
- id: cryptography
title: Cryptography
description: Candidate concept extracted from lesson 'Cryptography and Information
Hiding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: hiding
title: Hiding
description: Candidate concept extracted from lesson 'Cryptography and Information
Hiding'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: thermodynamics-and-entropy
title: Thermodynamics and Entropy
description: '- Objective: Explain how thermodynamic entropy relates to, and differs
from, Shannon entropy.
- Exercise: Compare the two entropy notions and identify what is preserved across
the analogy.
This lesson connects Thermodynamics, Entropy, and P'
prerequisites:
- cryptography-and-information-hiding
mastery_signals: []
mastery_profile: {}
- id: thermodynamics
title: Thermodynamics
description: Candidate concept extracted from lesson 'Thermodynamics and Entropy'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: course-synthesis
title: Course Synthesis
description: '- Objective: Synthesize the course by connecting entropy, coding,
reliability, and physical interpretation in one coherent narrative.
- Exercise: Produce a final study guide that links source coding, channel coding,
secrecy, and thermodynam'
prerequisites:
- thermodynamics-and-entropy
mastery_signals: []
mastery_profile: {}
- id: course
title: Course
description: Candidate concept extracted from lesson 'Course Synthesis'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: synthesis
title: Synthesis
description: Candidate concept extracted from lesson 'Course Synthesis'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: synthesize
title: Synthesize
description: Candidate concept extracted from lesson 'Course Synthesis'.
prerequisites: []
mastery_signals: []
mastery_profile: {}
- id: produce
title: Produce
description: Candidate concept extracted from lesson 'Course Synthesis'.
prerequisites: []
mastery_signals: []
mastery_profile: {}

View File

@ -0,0 +1,3 @@
# Conflict Report
- none

View File

@ -0,0 +1,10 @@
{
  "rights_note": "Derived from MIT OpenCourseWare 6.050J Information and Entropy (Spring 2008). Retain MIT OCW attribution and applicable Creative Commons terms before redistribution.",
  "sources": [
    {
      "source_path": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/6-050j-information-and-entropy.md",
      "source_type": "markdown",
      "title": "6 050J Information And Entropy"
    }
  ]
}

View File

@ -0,0 +1,14 @@
name: mit-ocw-information-and-entropy
display_name: MIT OCW Information and Entropy
version: 0.1.0-draft
schema_version: '1'
didactopus_min_version: 0.1.0
didactopus_max_version: 0.9.99
description: Draft topic pack generated from multi-course inputs for 'MIT OCW Information
and Entropy'.
author: MIT OCW derived demo
license: CC BY-NC-SA 4.0
dependencies: []
overrides: []
profile_templates: {}
cross_pack_links: []

View File

@ -0,0 +1,20 @@
{
  "pack_id": "mit-ocw-information-and-entropy",
  "display_name": "MIT OCW Information and Entropy",
  "derived_from_sources": [
    "mit-ocw-6-050j-course-home",
    "mit-ocw-6-050j-unit-8-textbook",
    "mit-ocw-6-050j-unit-13-textbook"
  ],
  "attribution_required": true,
  "share_alike_required": true,
  "noncommercial_only": true,
  "restrictive_flags": [
    "share-alike",
    "noncommercial"
  ],
  "redistribution_notes": [
    "Derived redistributable material may need to remain under the same license family.",
    "Derived redistributable material may be limited to noncommercial use."
  ]
}

View File

@ -0,0 +1 @@
projects: []

View File

@ -0,0 +1,59 @@
# Review Report
- Module 'Imported from MARKDOWN' has no explicit exercises; mastery signals may be weak.
- Concept 'MIT OCW 6.050J Information and Entropy' has no extracted mastery signals; review manually.
- Concept 'Information' has no extracted mastery signals; review manually.
- Concept 'Entropy' has no extracted mastery signals; review manually.
- Concept 'Source' has no extracted mastery signals; review manually.
- Concept 'OpenCourseWare' has no extracted mastery signals; review manually.
- Concept 'Spring' has no extracted mastery signals; review manually.
- Concept 'Attribution' has no extracted mastery signals; review manually.
- Concept 'Counting and Probability' has no extracted mastery signals; review manually.
- Concept 'Counting' has no extracted mastery signals; review manually.
- Concept 'Probability' has no extracted mastery signals; review manually.
- Concept 'Objective' has no extracted mastery signals; review manually.
- Concept 'Explain' has no extracted mastery signals; review manually.
- Concept 'Exercise' has no extracted mastery signals; review manually.
- Concept 'Derive' has no extracted mastery signals; review manually.
- Concept 'This' has no extracted mastery signals; review manually.
- Concept 'Random' has no extracted mastery signals; review manually.
- Concept 'Shannon Entropy' has no extracted mastery signals; review manually.
- Concept 'Shannon' has no extracted mastery signals; review manually.
- Concept 'Compute' has no extracted mastery signals; review manually.
- Concept 'Bernoulli' has no extracted mastery signals; review manually.
- Concept 'Mutual Information' has no extracted mastery signals; review manually.
- Concept 'Mutual' has no extracted mastery signals; review manually.
- Concept 'Compare' has no extracted mastery signals; review manually.
- Concept 'Dependence' has no extracted mastery signals; review manually.
- Concept 'Data Compression' has no extracted mastery signals; review manually.
- Concept 'Data' has no extracted mastery signals; review manually.
- Concept 'Compression' has no extracted mastery signals; review manually.
- Concept 'Describe' has no extracted mastery signals; review manually.
- Concept 'Redundancy' has no extracted mastery signals; review manually.
- Concept 'Huffman Coding' has no extracted mastery signals; review manually.
- Concept 'Huffman' has no extracted mastery signals; review manually.
- Concept 'Coding' has no extracted mastery signals; review manually.
- Concept 'Build' has no extracted mastery signals; review manually.
- Concept 'Prefix' has no extracted mastery signals; review manually.
- Concept 'Channel Capacity' has no extracted mastery signals; review manually.
- Concept 'Channel' has no extracted mastery signals; review manually.
- Concept 'Capacity' has no extracted mastery signals; review manually.
- Concept 'State' has no extracted mastery signals; review manually.
- Concept 'Reliable' has no extracted mastery signals; review manually.
- Concept 'Channel Coding' has no extracted mastery signals; review manually.
- Concept 'Contrast' has no extracted mastery signals; review manually.
- Concept 'Decoding' has no extracted mastery signals; review manually.
- Concept 'Error Correcting Codes' has no extracted mastery signals; review manually.
- Concept 'Error' has no extracted mastery signals; review manually.
- Concept 'Correcting' has no extracted mastery signals; review manually.
- Concept 'Codes' has no extracted mastery signals; review manually.
- Concept 'Cryptography and Information Hiding' has no extracted mastery signals; review manually.
- Concept 'Cryptography' has no extracted mastery signals; review manually.
- Concept 'Hiding' has no extracted mastery signals; review manually.
- Concept 'Thermodynamics and Entropy' has no extracted mastery signals; review manually.
- Concept 'Thermodynamics' has no extracted mastery signals; review manually.
- Concept 'Course Synthesis' has no extracted mastery signals; review manually.
- Concept 'Course' has no extracted mastery signals; review manually.
- Concept 'Synthesis' has no extracted mastery signals; review manually.
- Concept 'Synthesize' has no extracted mastery signals; review manually.
- Concept 'Produce' has no extracted mastery signals; review manually.

View File

@ -0,0 +1,17 @@
stages:
  - id: stage-1
    title: Imported from MARKDOWN
    concepts:
      - mit-ocw-6-050j-information-and-entropy
      - counting-and-probability
      - shannon-entropy
      - mutual-information
      - data-compression
      - huffman-coding
      - channel-capacity
      - channel-coding
      - error-correcting-codes
      - cryptography-and-information-hiding
      - thermodynamics-and-entropy
      - course-synthesis
    checkpoint: []

View File

@ -0,0 +1,6 @@
rubrics:
  - id: draft-rubric
    title: Draft Rubric
    criteria:
      - correctness
      - explanation

View File

@ -0,0 +1,39 @@
sources:
  - source_id: mit-ocw-6-050j-course-home
    title: MIT OpenCourseWare 6.050J Information and Entropy course home
    url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/
    publisher: Massachusetts Institute of Technology
    creator: MIT OpenCourseWare
    license_id: CC BY-NC-SA 4.0
    license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
    retrieved_at: "2026-03-14"
    adapted: true
    attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
    excluded_from_upstream_license: false
    exclusion_notes: ""
  - source_id: mit-ocw-6-050j-unit-8-textbook
    title: MIT OpenCourseWare 6.050J Information and Entropy Unit 8 textbook/resource page
    url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/resources/mit6_050js08_textbook_1/
    publisher: Massachusetts Institute of Technology
    creator: MIT OpenCourseWare
    license_id: CC BY-NC-SA 4.0
    license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
    retrieved_at: "2026-03-14"
    adapted: true
    attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
    excluded_from_upstream_license: false
    exclusion_notes: ""
  - source_id: mit-ocw-6-050j-unit-13-textbook
    title: MIT OpenCourseWare 6.050J Information and Entropy Unit 13 textbook/resource page
    url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/resources/mit6_050js08_textbook_2/
    publisher: Massachusetts Institute of Technology
    creator: MIT OpenCourseWare
    license_id: CC BY-NC-SA 4.0
    license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
    retrieved_at: "2026-03-14"
    adapted: true
    attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
    excluded_from_upstream_license: false
    exclusion_notes: ""

View File

@ -2,6 +2,7 @@
  "course_source": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/6-050j-information-and-entropy.md",
  "pack_dir": "/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy",
  "skill_dir": "/home/netuser/dev/Didactopustry1/skills/ocw-information-entropy-agent",
  "source_inventory": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/sources.yaml",
  "review_flags": [
    "Module 'Imported from MARKDOWN' has no explicit exercises; mastery signals may be weak.",
    "Concept 'MIT OCW 6.050J Information and Entropy' has no extracted mastery signals; review manually.",
@ -89,5 +90,26 @@
    "mit-ocw-information-and-entropy::shannon-entropy",
    "mit-ocw-information-and-entropy::thermodynamics-and-entropy"
  ],
  "artifact_count": 11
  "artifact_count": 11,
  "compliance_manifest": "/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy/pack_compliance_manifest.json",
  "compliance": {
    "pack_id": "mit-ocw-information-and-entropy",
    "display_name": "MIT OCW Information and Entropy",
    "derived_from_sources": [
      "mit-ocw-6-050j-course-home",
      "mit-ocw-6-050j-unit-8-textbook",
      "mit-ocw-6-050j-unit-13-textbook"
    ],
    "attribution_required": true,
    "share_alike_required": true,
    "noncommercial_only": true,
    "restrictive_flags": [
      "share-alike",
      "noncommercial"
    ],
    "redistribution_notes": [
      "Derived redistributable material may need to remain under the same license family.",
      "Derived redistributable material may be limited to noncommercial use."
    ]
  }
}

View File

@ -0,0 +1,39 @@
sources:
- source_id: mit-ocw-6-050j-course-home
title: MIT OpenCourseWare 6.050J Information and Entropy course home
url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/
publisher: Massachusetts Institute of Technology
creator: MIT OpenCourseWare
license_id: CC BY-NC-SA 4.0
license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
retrieved_at: "2026-03-14"
adapted: true
attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
excluded_from_upstream_license: false
exclusion_notes: ""
- source_id: mit-ocw-6-050j-unit-8-textbook
title: MIT OpenCourseWare 6.050J Information and Entropy Unit 8 textbook/resource page
url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/resources/mit6_050js08_textbook_1/
publisher: Massachusetts Institute of Technology
creator: MIT OpenCourseWare
license_id: CC BY-NC-SA 4.0
license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
retrieved_at: "2026-03-14"
adapted: true
attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
excluded_from_upstream_license: false
exclusion_notes: ""
- source_id: mit-ocw-6-050j-unit-13-textbook
title: MIT OpenCourseWare 6.050J Information and Entropy Unit 13 textbook/resource page
url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/resources/mit6_050js08_textbook_2/
publisher: Massachusetts Institute of Technology
creator: MIT OpenCourseWare
license_id: CC BY-NC-SA 4.0
license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
retrieved_at: "2026-03-14"
adapted: true
attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
excluded_from_upstream_license: false
exclusion_notes: ""

View File

@ -0,0 +1,20 @@
{
  "pack_id": "mit-ocw-information-and-entropy",
  "display_name": "MIT OCW Information and Entropy",
  "derived_from_sources": [
    "mit-ocw-6-050j-course-home",
    "mit-ocw-6-050j-unit-8-textbook",
    "mit-ocw-6-050j-unit-13-textbook"
  ],
  "attribution_required": true,
  "share_alike_required": true,
  "noncommercial_only": true,
  "restrictive_flags": [
    "share-alike",
    "noncommercial"
  ],
  "redistribution_notes": [
    "Derived redistributable material may need to remain under the same license family.",
    "Derived redistributable material may be limited to noncommercial use."
  ]
}

View File

@ -0,0 +1,39 @@
sources:
- source_id: mit-ocw-6-050j-course-home
title: MIT OpenCourseWare 6.050J Information and Entropy course home
url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/
publisher: Massachusetts Institute of Technology
creator: MIT OpenCourseWare
license_id: CC BY-NC-SA 4.0
license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
retrieved_at: "2026-03-14"
adapted: true
attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
excluded_from_upstream_license: false
exclusion_notes: ""
- source_id: mit-ocw-6-050j-unit-8-textbook
title: MIT OpenCourseWare 6.050J Information and Entropy Unit 8 textbook/resource page
url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/resources/mit6_050js08_textbook_1/
publisher: Massachusetts Institute of Technology
creator: MIT OpenCourseWare
license_id: CC BY-NC-SA 4.0
license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
retrieved_at: "2026-03-14"
adapted: true
attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
excluded_from_upstream_license: false
exclusion_notes: ""
- source_id: mit-ocw-6-050j-unit-13-textbook
title: MIT OpenCourseWare 6.050J Information and Entropy Unit 13 textbook/resource page
url: https://ocw.mit.edu/courses/6-050j-information-and-entropy-spring-2008/resources/mit6_050js08_textbook_2/
publisher: Massachusetts Institute of Technology
creator: MIT OpenCourseWare
license_id: CC BY-NC-SA 4.0
license_url: https://creativecommons.org/licenses/by-nc-sa/4.0/
retrieved_at: "2026-03-14"
adapted: true
attribution_text: Derived in part from MIT OpenCourseWare 6.050J Information and Entropy course materials used under CC BY-NC-SA 4.0.
excluded_from_upstream_license: false
exclusion_notes: ""

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


View File

@ -2,6 +2,7 @@
"course_source": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/6-050j-information-and-entropy.md",
"pack_dir": "/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy",
"skill_dir": "/home/netuser/dev/Didactopustry1/skills/ocw-information-entropy-agent",
"source_inventory": "/home/netuser/dev/Didactopustry1/examples/ocw-information-entropy/sources.yaml",
"review_flags": [
"Module 'Imported from MARKDOWN' has no explicit exercises; mastery signals may be weak.",
"Concept 'MIT OCW 6.050J Information and Entropy' has no extracted mastery signals; review manually.",
@@ -89,5 +90,26 @@
"mit-ocw-information-and-entropy::shannon-entropy",
"mit-ocw-information-and-entropy::thermodynamics-and-entropy"
],
"artifact_count": 11
"artifact_count": 11,
"compliance_manifest": "/home/netuser/dev/Didactopustry1/domain-packs/mit-ocw-information-entropy/pack_compliance_manifest.json",
"compliance": {
"pack_id": "mit-ocw-information-and-entropy",
"display_name": "MIT OCW Information and Entropy",
"derived_from_sources": [
"mit-ocw-6-050j-course-home",
"mit-ocw-6-050j-unit-8-textbook",
"mit-ocw-6-050j-unit-13-textbook"
],
"attribution_required": true,
"share_alike_required": true,
"noncommercial_only": true,
"restrictive_flags": [
"share-alike",
"noncommercial"
],
"redistribution_notes": [
"Derived redistributable material may need to remain under the same license family.",
"Derived redistributable material may be limited to noncommercial use."
]
}
}
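A downstream consumer can gate redistribution on the `restrictive_flags` recorded in the summary above. A hedged sketch (the manifest shape is taken from the JSON above; the gating helper is illustrative):

```python
def redistribution_warnings(compliance: dict) -> list[str]:
    """Map restrictive license flags from a compliance block to warnings."""
    messages = {
        "share-alike": "Derived material may need to stay under the same license family.",
        "noncommercial": "Derived material may be limited to noncommercial use.",
    }
    return [messages[f] for f in compliance.get("restrictive_flags", []) if f in messages]
```

For the manifest above this yields both warnings, matching the `redistribution_notes` the pipeline writes.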


@@ -5,6 +5,7 @@ from pathlib import Path
from .agentic_loop import AgenticStudentState, integrate_attempt
from .artifact_registry import validate_pack
from .course_ingestion_compliance import build_pack_compliance_manifest, load_sources, write_manifest
from .document_adapters import adapt_document
from .evaluator_pipeline import LearnerAttempt
from .graph_builder import build_concept_graph
@@ -123,11 +124,13 @@ def _write_skill_bundle(
def run_ocw_information_entropy_demo(
course_source: str | Path,
source_inventory: str | Path,
pack_dir: str | Path,
run_dir: str | Path,
skill_dir: str | Path,
) -> dict:
course_source = Path(course_source)
source_inventory = Path(source_inventory)
pack_dir = Path(pack_dir)
run_dir = Path(run_dir)
skill_dir = Path(skill_dir)
@@ -150,6 +153,11 @@ def run_ocw_information_entropy_demo(
conflicts=[],
)
write_draft_pack(draft, pack_dir)
if source_inventory.exists():
inventory = load_sources(source_inventory)
compliance_manifest = build_pack_compliance_manifest(draft.pack["name"], draft.pack["display_name"], inventory)
write_manifest(compliance_manifest, pack_dir / "pack_compliance_manifest.json")
(pack_dir / "source_inventory.yaml").write_text(source_inventory.read_text(encoding="utf-8"), encoding="utf-8")
validation = validate_pack(pack_dir)
if not validation.is_valid:
@@ -183,6 +191,7 @@ def run_ocw_information_entropy_demo(
"course_source": str(course_source),
"pack_dir": str(pack_dir),
"skill_dir": str(skill_dir),
"source_inventory": str(source_inventory),
"review_flags": list(ctx.review_flags),
"concept_count": len(ctx.concepts),
"target_concept": target_key,
@@ -190,6 +199,10 @@ def run_ocw_information_entropy_demo(
"mastered_concepts": sorted(state.mastered_concepts),
"artifact_count": len(state.artifacts),
}
compliance_path = pack_dir / "pack_compliance_manifest.json"
if compliance_path.exists():
summary["compliance_manifest"] = str(compliance_path)
summary["compliance"] = json.loads(compliance_path.read_text(encoding="utf-8"))
(run_dir / "run_summary.json").write_text(json.dumps(summary, indent=2), encoding="utf-8")
_write_skill_bundle(skill_dir, pack_dir, run_dir, concept_path, summary["mastered_concepts"])
@@ -205,6 +218,10 @@ def main() -> None:
"--course-source",
default=str(root / "examples" / "ocw-information-entropy" / "6-050j-information-and-entropy.md"),
)
parser.add_argument(
"--source-inventory",
default=str(root / "examples" / "ocw-information-entropy" / "sources.yaml"),
)
parser.add_argument(
"--pack-dir",
default=str(root / "domain-packs" / "mit-ocw-information-entropy"),
@@ -221,6 +238,7 @@ def main() -> None:
summary = run_ocw_information_entropy_demo(
course_source=args.course_source,
source_inventory=args.source_inventory,
pack_dir=args.pack_dir,
run_dir=args.run_dir,
skill_dir=args.skill_dir,
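The manifest fields written by `build_pack_compliance_manifest` can be inferred from the summary JSON shown earlier. One possible shape, as a sketch only (this is not the repository's implementation; the flag derivation from license identifiers is an assumption):

```python
def build_pack_compliance_manifest(pack_id, display_name, inventory):
    """Derive pack-level compliance flags from per-source license metadata.

    Assumes each inventory entry carries the license_id and exclusion flag
    shown in sources.yaml; the SA/NC detection below is illustrative.
    """
    included = [e for e in inventory if not e.get("excluded_from_upstream_license")]
    licenses = {e["license_id"] for e in included}
    share_alike = any("SA" in lic for lic in licenses)
    noncommercial = any("NC" in lic for lic in licenses)
    flags = []
    if share_alike:
        flags.append("share-alike")
    if noncommercial:
        flags.append("noncommercial")
    return {
        "pack_id": pack_id,
        "display_name": display_name,
        "derived_from_sources": [e["source_id"] for e in included],
        "attribution_required": bool(included),
        "share_alike_required": share_alike,
        "noncommercial_only": noncommercial,
        "restrictive_flags": flags,
    }
```

For the three CC BY-NC-SA 4.0 sources in this commit, such a derivation would produce exactly the `share-alike`/`noncommercial` flags seen in the run summary.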


@@ -28,4 +28,4 @@ def test_dependency_resolution() -> None:
results = discover_domain_packs(["domain-packs"])
errors = check_pack_dependencies(results)
assert any("depends on missing pack 'nonexistent-pack'" in err for err in errors)
assert not any("bayes-extension" in err for err in errors and "foundations-statistics" in err)
assert not any("bayes-extension" in err and "foundations-statistics" in err for err in errors)
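The one-line fix above moves the second substring check inside the `any()` predicate. In the old line, `for err in errors and "foundations-statistics" in err` parsed the iterable as `errors and (...)`, referencing `err` before it is bound; the corrected form tests both substrings per error line:

```python
# Corrected pattern: both conditions live inside the any() predicate,
# so each error string is checked for the pair together.
errors = [
    "pack 'bayes-extension' depends on missing pack 'nonexistent-pack'",
    "pack 'foundations-statistics' is valid",
]
has_pair = any("bayes-extension" in err and "foundations-statistics" in err for err in errors)
assert not has_pair  # no single error mentions both packs
```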


@@ -7,12 +7,14 @@ def test_ocw_information_entropy_demo_generates_pack_and_skill(tmp_path: Path) -
root = Path(__file__).resolve().parents[1]
summary = run_ocw_information_entropy_demo(
course_source=root / "examples" / "ocw-information-entropy" / "6-050j-information-and-entropy.md",
source_inventory=root / "examples" / "ocw-information-entropy" / "sources.yaml",
pack_dir=tmp_path / "pack",
run_dir=tmp_path / "run",
skill_dir=tmp_path / "skill",
)
assert (tmp_path / "pack" / "pack.yaml").exists()
assert (tmp_path / "pack" / "pack_compliance_manifest.json").exists()
assert (tmp_path / "run" / "capability_profile.json").exists()
assert (tmp_path / "skill" / "SKILL.md").exists()
assert summary["target_concept"].endswith("thermodynamics-and-entropy")


@@ -0,0 +1,33 @@
from pathlib import Path
from didactopus.ocw_skill_agent_demo import (
evaluate_submission_with_skill,
load_ocw_skill_context,
run_ocw_skill_agent_demo,
)
def test_run_ocw_skill_agent_demo(tmp_path: Path) -> None:
root = Path(__file__).resolve().parents[1]
payload = run_ocw_skill_agent_demo(
root / "skills" / "ocw-information-entropy-agent",
tmp_path,
)
assert (tmp_path / "skill_demo.json").exists()
assert (tmp_path / "skill_demo.md").exists()
assert payload["study_plan"]["steps"]
assert payload["evaluation"]["verdict"] in {"acceptable", "needs_revision"}
def test_skill_demo_flags_weak_submission() -> None:
root = Path(__file__).resolve().parents[1]
context = load_ocw_skill_context(root / "skills" / "ocw-information-entropy-agent")
result = evaluate_submission_with_skill(
context,
"channel-capacity",
"Channel capacity is good.",
)
assert result["verdict"] == "needs_revision"
assert "Rework the answer" in result["follow_up"]