.
├── docs
│   ├── .nojekyll
│   ├── index.md
│   ├── chapter1
│   │   ├── res
│   │   │   ├── chapter1-1.png
│   │   │   ├── chapter1-10.png
│   │   │   ├── chapter1-11.png
│   │   │   ├── chapter1-12.png
│   │   │   ├── chapter1-13.png
│   │   │   ├── chapter1-14.png
│   │   │   ├── chapter1-15.png
│   │   │   ├── chapter1-16.png
│   │   │   ├── chapter1-17.png
│   │   │   ├── chapter1-18.png
│   │   │   ├── chapter1-2.png
│   │   │   ├── chapter1-3.png
│   │   │   ├── chapter1-4.png
│   │   │   ├── chapter1-5.png
│   │   │   ├── chapter1-6.png
│   │   │   ├── chapter1-7.png
│   │   │   ├── chapter1-8.png
│   │   │   └── chapter1-9.png
│   │   └── chapter1.md
│   ├── chapter2
│   │   ├── res
│   │   │   ├── chapter2-1.png
│   │   │   ├── chapter2-2.png
│   │   │   ├── chapter2-3.png
│   │   │   ├── chapter2-4.png
│   │   │   ├── chapter2-5.png
│   │   │   ├── chapter2-6.png
│   │   │   ├── chapter2-7.png
│   │   │   ├── chapter2-8.png
│   │   │   └── chapter2-9.png
│   │   └── chapter2.md
│   ├── chapter3
│   │   ├── res
│   │   │   ├── chapter3-1.png
│   │   │   ├── chapter3-2.png
│   │   │   ├── chapter3-3.png
│   │   │   ├── chapter3-4.png
│   │   │   ├── chapter3-5.png
│   │   │   ├── chapter3-6.png
│   │   │   ├── chapter3-7.png
│   │   │   ├── chapter3-8.png
│   │   │   └── chapter3-9.png
│   │   └── chapter3.md
│   ├── chapter4
│   │   ├── res
│   │   │   ├── chapter4-1.png
│   │   │   ├── chapter4-2.png
│   │   │   ├── chapter4-3.png
│   │   │   ├── chapter4-4.png
│   │   │   ├── chapter4-5.png
│   │   │   ├── chapter4-6.png
│   │   │   ├── chapter4-7.png
│   │   │   ├── chapter4-8.png
│   │   │   └── chapter4-9.png
│   │   └── chapter4.md
│   ├── chapter5
│   │   ├── res
│   │   │   ├── chapter5-1.png
│   │   │   ├── chapter5-10.png
│   │   │   ├── chapter5-11.png
│   │   │   ├── chapter5-12.png
│   │   │   ├── chapter5-2.png
│   │   │   ├── chapter5-3.png
│   │   │   ├── chapter5-4.png
│   │   │   ├── chapter5-5.png
│   │   │   ├── chapter5-6.png
│   │   │   ├── chapter5-7.png
│   │   │   ├── chapter5-8.png
│   │   │   └── chapter5-9.png
│   │   └── chapter5.md
│   ├── chapter6
│   │   ├── res
│   │   │   ├── chapter6-1.png
│   │   │   ├── chapter6-10.png
│   │   │   ├── chapter6-11.png
│   │   │   ├── chapter6-12.png
│   │   │   ├── chapter6-13.png
│   │   │   ├── chapter6-14.png
│   │   │   ├── chapter6-15.png
│   │   │   ├── chapter6-16.png
│   │   │   ├── chapter6-17.png
│   │   │   ├── chapter6-18.png
│   │   │   ├── chapter6-19.png
│   │   │   ├── chapter6-2.png
│   │   │   ├── chapter6-20.png
│   │   │   ├── chapter6-21.png
│   │   │   ├── chapter6-22.png
│   │   │   ├── chapter6-23.png
│   │   │   ├── chapter6-24.png
│   │   │   ├── chapter6-25.png
│   │   │   ├── chapter6-4.png
│   │   │   ├── chapter6-5.png
│   │   │   ├── chapter6-6.png
│   │   │   ├── chapter6-7.png
│   │   │   ├── chapter6-8.png
│   │   │   └── chapter6-9.png
│   │   └── chapter6.md
│   ├── chapter7
│   │   ├── res
│   │   │   ├── chapter7-1.png
│   │   │   ├── chapter7-10.png
│   │   │   ├── chapter7-11.png
│   │   │   ├── chapter7-12.png
│   │   │   ├── chapter7-13.png
│   │   │   ├── chapter7-14.png
│   │   │   ├── chapter7-15.png
│   │   │   ├── chapter7-16.png
│   │   │   ├── chapter7-17.png
│   │   │   ├── chapter7-18.png
│   │   │   ├── chapter7-2.png
│   │   │   ├── chapter7-3.png
│   │   │   ├── chapter7-4.png
│   │   │   ├── chapter7-5.png
│   │   │   ├── chapter7-6.png
│   │   │   ├── chapter7-7.png
│   │   │   ├── chapter7-8.png
│   │   │   └── chapter7-9.png
│   │   └── chapter7.md
│   ├── chapter8
│   │   ├── res
│   │   │   ├── chapter8-1.png
│   │   │   ├── chapter8-10.png
│   │   │   ├── chapter8-11.png
│   │   │   ├── chapter8-12.png
│   │   │   ├── chapter8-13.png
│   │   │   ├── chapter8-14.png
│   │   │   ├── chapter8-15.png
│   │   │   ├── chapter8-16.png
│   │   │   ├── chapter8-17.png
│   │   │   ├── chapter8-18.png
│   │   │   ├── chapter8-19.png
│   │   │   ├── chapter8-2.png
│   │   │   ├── chapter8-20.png
│   │   │   ├── chapter8-3.png
│   │   │   ├── chapter8-4.png
│   │   │   ├── chapter8-5.png
│   │   │   ├── chapter8-6.png
│   │   │   ├── chapter8-7.png
│   │   │   ├── chapter8-8.png
│   │   │   └── chapter8-9.png
│   │   └── chapter8.md
│   ├── README.md
│   ├── _sidebar.md
│   └── index.html
├── .gitignore
├── README.md
└── LICENSE

--------------------------------------------------------------------------------
/docs/.nojekyll:
--------------------------------------------------------------------------------
(empty file)

--------------------------------------------------------------------------------
/docs/index.md:
--------------------------------------------------------------------------------
1 | # DEMO index.md

--------------------------------------------------------------------------------
/docs/chapter1/res/ … /docs/chapter8/res/ (*.png):
--------------------------------------------------------------------------------
Binary image assets, listed individually in the tree above. Each PNG is served
from the repository's raw-content URL under its own path, e.g.
https://raw.githubusercontent.com/ai-cloud-kubernetes/leela-notes/HEAD/docs/chapter1/res/chapter1-1.png
-------------------------------------------------------------------------------- /docs/chapter8/res/chapter8-20.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ai-cloud-kubernetes/leela-notes/HEAD/docs/chapter8/res/chapter8-20.png -------------------------------------------------------------------------------- /docs/chapter8/res/chapter8-3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ai-cloud-kubernetes/leela-notes/HEAD/docs/chapter8/res/chapter8-3.png -------------------------------------------------------------------------------- /docs/chapter8/res/chapter8-4.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ai-cloud-kubernetes/leela-notes/HEAD/docs/chapter8/res/chapter8-4.png -------------------------------------------------------------------------------- /docs/chapter8/res/chapter8-5.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ai-cloud-kubernetes/leela-notes/HEAD/docs/chapter8/res/chapter8-5.png -------------------------------------------------------------------------------- /docs/chapter8/res/chapter8-6.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ai-cloud-kubernetes/leela-notes/HEAD/docs/chapter8/res/chapter8-6.png -------------------------------------------------------------------------------- /docs/chapter8/res/chapter8-7.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ai-cloud-kubernetes/leela-notes/HEAD/docs/chapter8/res/chapter8-7.png -------------------------------------------------------------------------------- /docs/chapter8/res/chapter8-8.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/ai-cloud-kubernetes/leela-notes/HEAD/docs/chapter8/res/chapter8-8.png -------------------------------------------------------------------------------- /docs/chapter8/res/chapter8-9.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ai-cloud-kubernetes/leela-notes/HEAD/docs/chapter8/res/chapter8-9.png -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # Node rules: 2 | ## Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files) 3 | .grunt 4 | 5 | ## Dependency directory 6 | ## Commenting this out is preferred by some people, see 7 | ## https://docs.npmjs.com/misc/faq#should-i-check-my-node_modules-folder-into-git 8 | node_modules 9 | 10 | # Book build output 11 | _book 12 | 13 | # eBook build output 14 | *.epub 15 | *.mobi 16 | *.pdf 17 | -------------------------------------------------------------------------------- /docs/README.md: -------------------------------------------------------------------------------- 1 | # 李宏毅线性代数笔记(Leela-Notes) 2 | 李宏毅老师的[线性代数视频](https://www.bilibili.com/video/av64160249?from=search&seid=3301869198468514506)是线性代数领域经典的中文视频之一,线性代数的应用范围覆盖自然科学和社会科学的各个方面,尤其是在机器学习,深度学习领域处理高维问题时,向量空间和矩阵运算更是这些强大算法的理论基础和基本工具。李老师以幽默风趣的上课风格让很多晦涩难懂的线性代数的理论变得轻松易懂,并且老师会通过很多有趣的例子其它学科的理论在课堂上展现出来,并且逐步推导深奥的理论知识,对于想要学习线性代数的人来说绝对是非常推荐的。在学习完李宏毅老师的《线性代数》课程后,学习李宏毅老师《机器学习》会更有自信。 3 | 4 | 李宏毅《机器学习》github地址:https://github.com/datawhalechina/leeml-notes 5 | 6 | ## 使用说明 7 | 这个笔记是根据李宏毅老师线性代数视频的一个辅助资料,本笔记基本上完全复刻李老师课堂上讲的所有内容,并加入了一些和相关的学习补充资料和参考资料,结合这些资料一起学习,相信你会对线性代数有更加深刻的理解,为以后学习李宏毅老师的机器学习课程打下了基础。 8 | 9 | 李宏毅《机器学习》:https://github.com/datawhalechina/leeml-notes 10 | 11 | 李宏毅《线性代数》:https://github.com/datawhalechina/leela-notes 12 | 13 | ## 笔记在线阅读地址 14 
| 在线阅读地址:https://datawhalechina.github.io/leela-notes/#/ 15 | ## 课程在线观看地址 16 | bilibili:[李宏毅《线性代数》](https://www.bilibili.com/video/av64160249?from=search&seid=3301869198468514506) 17 | 18 | ## 主要贡献者(按首字母排名) 19 | - [@DatawhaleXiuyuan](https://github.com/DatawhaleXiuyuan) 20 | - [@spareribs](https://github.com/spareribs) 21 | 22 | 23 | 24 | # 关注我们 25 | 26 |
Datawhale,一个专注于AI领域的学习圈子。初衷是for the learner,和学习者一起成长。目前加入学习社群的人数已经数千人,组织了机器学习,深度学习,数据分析,数据挖掘,爬虫,编程,统计学,Mysql,数据竞赛等多个领域的内容学习,微信搜索公众号Datawhale可以加入我们。
27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | 35 | 36 | -------------------------------------------------------------------------------- /docs/_sidebar.md: -------------------------------------------------------------------------------- 1 | - 目录 2 | - [ P1学习目标](chapter1/chapter1.md) 3 | - [P2线性方程组](chapter2/chapter2.md) 4 | - [P3向量](chapter3/chapter3.md) 5 | - [P4矩阵](chapter4/chapter4.md) 6 | - [P5矩阵向量相乘](chapter5/chapter5.md) 7 | - [P6是否有解](chapter6/chapter6.md) 8 | - [P7有多少解](chapter7/chapter7.md) 9 | - [P8解线性方程组(1)](chapter8/chapter8.md) 10 | - [P9解线性方程组(2)](chapter9/chapter9.md) 11 | - [P10从行最简阶梯形能学到什么(1)](chapter10/chapter10.md) 12 | - [P11从行最简阶梯形能学到什么(2)](chapter11/chapter11.md) 13 | - [P12从行最简阶梯形能学到什么(3)](chapter12/chapter12.md) 14 | - [P13从行最简阶梯形能学到什么(3)](chapter13/chapter13.md) 15 | - [P14矩阵相乘](chapter14/chapter14.md) 16 | - [P15矩阵的逆](chapter15/chapter15.md) 17 | - [P16可逆性](chapter16/chapter16.md) 18 | - [P17怎么找一个矩阵的逆](chapter17/chapter17.md) 19 | - [P18子空间](chapter18/chapter18.md) 20 | - [P19基](chapter19/chapter19.md) 21 | - [P20行列空间与零空间](chapter20/chapter20.md) 22 | - [P21坐标系统](chapter21/chapter21.md) 23 | - [P22坐标系统中的线性变化](chapter22/chapter22.md) 24 | - [P23行列式的公式](chapter23/chapter23.md) 25 | - [P24行列式的值](chapter24/chapter24.md) 26 | - [P25特征值与特征向量](chapter25/chapter25.md) 27 | - [P26相似对角化](chapter26/chapter26.md) 28 | - [P27线性变化的相似对角化](chapter27/chapter27.md) 29 | - [P28网页排名](chapter28/chapter28.md) 30 | - [P29正交性](chapter29/chapter29.md) 31 | - [P30正交投影](chapter30/chapter30.md) 32 | - [P31施密特正交化](chapter31/chapter31.md) 33 | - [P32正交基](chapter32/chapter32.md) 34 | - [P33正交矩阵](chapter33/chapter33.md) 35 | - [P34对称矩阵](chapter34/chapter34.md) 36 | - [P35广义向量(1)](chapter35/chapter35.md) 37 | - [P36广义向量(2)](chapter36/chapter36.md) 38 | - [P37奇异值分解](chapter37/chapter37.md) 39 | 40 | 41 | 42 | 43 | -------------------------------------------------------------------------------- /docs/index.html: 
-------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | LeeLA-Notes 6 | 7 | 8 | 9 | 10 | 11 | 12 |
13 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | 52 | 53 | 54 | -------------------------------------------------------------------------------- /docs/chapter4/chapter4.md: -------------------------------------------------------------------------------- 1 | 2 | ## 矩阵 3 | ### 矩阵的定义 4 | 5 | ![](res/chapter4-1.png) 6 | 7 | ![](res/chapter4-2.png) 8 | 9 | 有一组vector放在一起就变为matrix,假设现在有3个vector$a_1,a_2,a_3$,将3个vector排在一起。这三个vector就变成A这个matrix的三个columns。 10 | 11 | 12 | 13 | $$A=[a_1,a_2,a_3]=\begin{bmatrix} 14 | 2 &3 &5 \\ 15 | 3 & 1 & -1\\ 16 | -2& 1& 1 17 | \end{bmatrix}$$ 18 | 19 | ### 矩阵的写法 20 | 21 | ![](res/chapter4-3.png) 22 | 23 | 接下来我们来定义下size的写法,如果一个matrix有m个rows n个columns,我们就说它的size是m x n。如果一个matrix的m跟n是一样的就说它是square,我们会用**M**,下标m x n来代表所有m by n matrix所成的集合。 24 | 25 | 比如一个matrix有二个rows,3个columns,size描述为2 x 3($M_{2x3}$)。另外一个matrix有3个rows,2个columns,size描述为3 x 2($M_{3x2}$)。 26 | 27 | 在matrix里面原则是:先row再column 28 | 29 | 30 | ![](res/chapter4-4.png) 31 | 32 | 33 | 如果今天要讲matrix第i个row,第j个column的成员时,那就用$(i,j)$来代表。假设有一个m by n 的matrix A(m个rows,n个columns),在写每一个component下标的时候,原则是先写row index,再写column index。 34 | 35 | 有一个3 x 3的matrix,3是第一个row,第二个column,所以它的index是(1,2)。左下角-2是第三个row,第一个column,所以它的index是(3,1) 36 | 37 | ## 矩阵的运算 38 | 39 | ![](res/chapter4-5.png) 40 | 41 | 两个matrix有一模一样的size时,就可以将matrix相加或者相减,也可以给matrix乘以scalar。假设现在有两个matrix **A,B**,想给matrix B乘以9,就是将9乘以B的每一个component。A+B就是将同一位置的element相加,A-B就是将同一位置的element相减。 42 | 43 | ## 特殊矩阵 44 | 45 | ![](res/chapter4-6.png) 46 | 47 | 48 | 定义一些特别的matrix,Zero matrix就是每一个element都是0的matrix,这时候会用O来表示,可以是任何的size,也可以给O加上下标变为$O_{m\times n}$。$O_{2\times 3}$就是说:有一个matrix里面都是0,它的row是2,column是3。任何的matrix加上Zero matrix都还是自己,任何的matrix乘以Zero matrix都还是Zero matrix,任何matrix减去自己本身的matrix是Zero matrix。 49 | 50 | Identity matrix必须是正方形的,Identity matrix的特性是:左上到右下的对角线是1,其它部分都是0。我们通常会用大写I来表示Identity matrix,加上一个下标来代表size($I_3$代表是一个3 by 3的Identity matrix)。我们不需要用两个下标来表示它。 51 | 52 | ## 矩阵的性质 53 | 54 | ![](res/chapter4-7.png) 55 | 56 |
matrix有很多的特性,假设有A,B,C三个matrix,dimension都是m by n,s和t都是scalars 57 | 58 | ## 转置 59 | 60 | ![](res/chapter4-8.png) 61 | 62 | 63 | 假设A是m by n的matrix,当给A加上一个上标T的意思是:$A^T$也是一个matrix,叫做transpose of A。A原来是m by n的matrix,$A^T$是n by m的matrix。 64 | 65 | 66 | 假设有一个matrix A,然后给matrix A做transpose,做法是: 以左上到右下的对角线为轴进行翻转。原来在matrix A中的9在第一个row第二个column,做transpose时row跟column交换,变为第二个row第一个column。原来matrix A中的2是第三个row第二个column,做transpose以后就是第二个row第三个column。 67 | 68 | ![](res/chapter4-9.png) 69 | 70 | 假设A跟B都是m by n的matrices,s是一个scalar。如果我把matrix A做一次transpose,然后把A的transpose再做一次transpose之后还是matrix A($(A^T)^T=A$),s乘以A再做transpose等于s乘以A的transpose($(sA)^T=sA^T$)。A+B的transpose等于A的transpose加上B的transpose($(A+B)^T=A^T+B^T$)。 71 | 72 | 73 | 74 | 75 | 76 | 77 | 78 | 79 | 80 | 81 | 82 | 83 | 84 | 85 | 86 | 87 | 88 | 89 | 90 | 91 | 92 | 93 | 94 | 95 | 96 | 97 | 98 | 99 | 100 | 101 | 102 | 103 | 104 | 105 | 106 | 107 | 108 | 109 | 110 | 111 | -------------------------------------------------------------------------------- /docs/chapter3/chapter3.md: -------------------------------------------------------------------------------- 1 | ## 高中版本的向量 2 | 3 | ![](res/chapter3-1.png) 4 | 5 | 6 | 本节投影片要讲什么是向量,先讲已经学过的高中版本的向量,之后在第六章会告诉你什么是真正的向量。 7 | 8 | ![](res/chapter3-2.png) 9 | 10 | ### 列向量 11 | 12 | 所谓的高中版本向量是:一组数字的集合用notation **v**来表示(粗体字表示向量)。举例来说:将数字1,2,3集合起来,就可以认为是一个向量。不一定是垂直方向(由上至下),也可以是水平的(由左至右),由上至下叫做column vector,由左往右叫做row vector,在这门课里面,我们通常提到的向量都认为是column vector而不是row vector。 13 | 14 | ### 向量的组成 15 | 16 | ![](res/chapter3-3.png) 17 | 18 | 19 | 在向量里面有很多的components,如果今天要提到向量**v**里面第i个成员的时候,就会用$v_i$来表示。右边这个向量有三个成员(1,2,3),第一个成员$v_1=1$,第二个成员$v_2=2$,第三个成员$v_3=3$。 20 |
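上面讲的component取值可以用一小段numpy代码来示意(这是笔记补充的示意代码,不是课程内容,假设已安装numpy):

```python
import numpy as np

# 一个有三个成员的向量 v,成员为 1, 2, 3
v = np.array([1, 2, 3])

# 第i个成员 v_i:注意Python的index从0开始,所以 v_1 对应 v[0]
print(v[0])  # v_1 = 1
print(v[1])  # v_2 = 2
print(v[2])  # v_3 = 3
```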
如果一个向量的components小于4(由小于四个数字组成),可以将它visualize(在二维空间里visualize二维的向量,可以在三维空间里visualize三维的向量)。通常的做法是:一个向量里面有二个component,如果是在二维空间里面,将第一个component当做x坐标的值,第二个component当做y坐标的值,然后画一个箭头从原点指到($v_1,v_2$)的地方(两个component所决定的点)。如果是三维也是一样(维度这个概念在第四章会详细的讲述什么是维度,我相信在高中会提到这个词汇,所以直接讲这是一个三维的向量,你也不会觉得困惑),三维的向量有三个element,如果要将这个向量可视化,三个element分别代表x轴坐标,y轴的坐标,z轴的坐标。这三个坐标可以在三维空间中定义出一个点,然后可以画出一个箭头从原点指向($v_1,v_2,v_3$)所决定的点。 22 | 23 | ## 向量运算 24 | ### 标量相乘 25 | 26 | ![](res/chapter3-4.png) 27 | 28 | 向量**v**的component是$v_1,v_2$,将向量**v**乘以数值c的意思是:将向量里面的每一个component都乘以c。向量v第一个component$v_1$,第二个component$v_2$。如果你将这个向量v乘以2倍,就是将$v_1$乘以2倍,$v_2$乘以2倍,就得到了一个新的向量。2**v**可以想象是将原来的向量**v**在原来的方向上变成2倍。 29 | 30 | ### 向量相加 31 | 32 | ![](res/chapter3-5.png) 33 | 34 | 向量也是可以进行相加,如果你有向量a,两个component分别是$a_1,a_2$,向量b的两个component分别是$b_1,b_2$。向量a加向量b:就是将两个向量同一个维度的component分别加起来($a_1+b_1, a_2+b_2$)。你也可以很单纯的想象是将a这个向量拿出来,将b这个向量的尾部接在a向量的头部就可以得到向量$a+b$。 35 | 36 | 37 | ## 向量集合 38 | 39 | ![](res/chapter3-6.png) 40 | 41 | 42 | 将很多的向量集合在一起得到vector set(四个三维向量),在这门课里面,当我们说到向量集合的时候,可以是有无穷多个成员。举例来说:我们有一个二维向量的集合,二维向量的component记为$x_1,x_2$,这个集合里面的成员($x_1,x_2$)都满足这个特性($x_1+x_2=1$)。$\begin{bmatrix} 43 | 0\\ 44 | 1 45 | \end{bmatrix}$`,`$\begin{bmatrix} 46 | 1\\ 47 | 0 48 | \end{bmatrix}$`,`$\begin{bmatrix} 49 | 0.3\\ 50 | 0.7 51 | \end{bmatrix}$`,`$\begin{bmatrix} 52 | 0.7\\ 53 | 0.3 54 | \end{bmatrix}$都是集合里面的成员。如果我们将这个集合画出来,如图所示其中红色这条线是$x_1+x_2=1$。高中学画向量时,就是从原点开始画一个箭头,所有箭头落在红色这条线上的向量就是这个集合里面的成员。箭头落在红色线上的向量其实有无穷多个,所以这个集合有无穷多个成员。 55 | 56 | 57 | ![](res/chapter3-7.png) 58 | 59 | 如果我们把所有n entries的向量通通集合起来,或者是将n维的向量集合起来,这个集合起来的向量被称为$R^n$(代表所有n维向量的集合) 60 | 61 | 如图这些vector通通属于$R^2$这个集合(二维向量),$R^2$这个集合其实涵盖了整个二维的空间,二维空间中所有的点都是属于$R^2$这个集合的成员。 62 | 63 | ## 向量的性质 64 | 65 | ![](res/chapter3-8.png) 66 | 67 | 向量有以下的特性,这些特性都是非常直觉的,接下来也不会做任何的证明。假设现在有向量**u, v, w**是$R^n$的成员,也就是说**u, v, w**是任意三个n维的向量,另外还有任意两个数值a, b。 68 |
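下面列的这些特性都可以用numpy直接验证(这是笔记补充的示意代码,不是课程内容,向量的数值是随便取的):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
w = np.array([7.0, 8.0, 9.0])
a, b = 2.0, 3.0
zero = np.zeros(3)  # zero vector

assert np.allclose(u + v, v + u)              # u+v 等于 v+u
assert np.allclose((u + v) + w, u + (v + w))  # 相加的顺序不影响结果
assert np.allclose(u + zero, u)               # 加上 zero vector 还是自己
assert np.allclose(u + (-u), zero)            # -u 就是 u 的 additive inverse
assert np.allclose((a * b) * u, a * (b * u))  # scalar 先相乘再乘向量,结果一样
```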
**u+v**一定等于**v+u**,可以先将**u**跟**v**加起来,再加**w**等于将**v**跟**w**加起来,再加**u**。如果在$R^n$里面存在一个很特别的向量,用0来表示,这个特别的向量的能力是:将0这个向量加上任何的向量u都会变回u本身。这个特别的向量,每个element都是0,所以将这个向量加上任何一个向量都还是那个向量本身。因为这个向量里面每一个成员都是0,所以称之为zero vector。 70 | 71 | 存在一个向量**u'**,这个**u'** 加上 **u**变为zero vector,对于每个**u**而言,可以找到对手**u'**,**u'** 叫做 additive inverse。 72 | 73 | 74 | a跟b两个scalar先相乘,再乘以**u**这个向量,等于b先乘以向量**u**,再乘以a。 75 | 76 | 77 | 在这张投影片里面,想要告诉你向量有以下这些特性,但事实上真正的向量定义是反过来的,并不是说向量有这些特性,而是说有这些特性的东西就叫做向量。 78 | 79 | ![](res/chapter3-9.png) 80 | 81 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # 李宏毅线性代数笔记(Leela-Notes) 2 | 李宏毅老师的[线性代数视频](https://www.bilibili.com/video/av64160249?from=search&seid=3301869198468514506)是线性代数领域经典的中文视频之一,线性代数的应用范围覆盖自然科学和社会科学的各个方面,尤其是在机器学习,深度学习领域处理高维问题时,向量空间和矩阵运算更是这些强大算法的理论基础和基本工具。李老师以幽默风趣的上课风格让很多晦涩难懂的线性代数的理论变得轻松易懂,并且老师会通过很多有趣的例子将其它学科的理论在课堂上展现出来,并且逐步推导深奥的理论知识,对于想要学习线性代数的人来说绝对是非常推荐的。在学习完李宏毅老师的《线性代数》课程后,学习李宏毅老师《机器学习》会更有自信。 3 | 4 | 李宏毅《机器学习》github地址:https://github.com/datawhalechina/leeml-notes 5 | 6 | ## 使用说明 7 | 这个笔记是根据李宏毅老师线性代数视频的一个辅助资料,本笔记基本上完全复刻李老师课堂上讲的所有内容,并加入了一些相关的学习补充资料和参考资料,结合这些资料一起学习,相信你会对线性代数有更加深刻的理解,为以后学习李宏毅老师的机器学习课程打下基础。 8 | 9 | 李宏毅《机器学习》:https://github.com/datawhalechina/leeml-notes 10 | 11 | 李宏毅《线性代数》:https://github.com/datawhalechina/leela-notes 12 | 13 | ## 笔记在线阅读地址 14 | 在线阅读地址:https://datawhalechina.github.io/leela-notes/#/ 15 | ## 课程在线观看地址 16 | bilibili:[李宏毅《线性代数》](https://www.bilibili.com/video/av64160249?from=search&seid=3301869198468514506) 17 | 18 | # 目录 19 | - [ P1学习目标](https://datawhalechina.github.io/leela-notes/#/chapter1/chapter1) 20 | - [P2线性方程组](https://datawhalechina.github.io/leela-notes/#/chapter2/chapter2) 21 | - [P3向量](https://datawhalechina.github.io/leela-notes/#/chapter3/chapter3) 22 | - [P4矩阵](https://datawhalechina.github.io/leela-notes/#/chapter4/chapter4) 23 | -
[P5矩阵向量相乘](https://datawhalechina.github.io/leela-notes/#/chapter5/chapter5) 24 | - [P6是否有解](https://datawhalechina.github.io/leela-notes/#/chapter6/chapter6) 25 | - [P7有多少解](https://datawhalechina.github.io/leela-notes/#/chapter7/chapter7) 26 | - [P8解线性方程组(1)](https://datawhalechina.github.io/leela-notes/#/chapter8/chapter8) 27 | - [P9解线性方程组(2)](chapter9/chapter9.md) 28 | - [P10从行最简阶梯形能学到什么(1)](chapter10/chapter10.md) 29 | - [P11从行最简阶梯形能学到什么(2)](chapter11/chapter11.md) 30 | - [P12从行最简阶梯形能学到什么(3)](chapter12/chapter12.md) 31 | - [P13从行最简阶梯形能学到什么(3)](chapter13/chapter13.md) 32 | - [P14矩阵相乘](chapter14/chapter14.md) 33 | - [P15矩阵的逆](chapter15/chapter15.md) 34 | - [P16可逆性](chapter16/chapter16.md) 35 | - [P17怎么找一个矩阵的逆](chapter17/chapter17.md) 36 | - [P18子空间](chapter18/chapter18.md) 37 | - [P19基](chapter19/chapter19.md) 38 | - [P20行列空间与零空间](chapter20/chapter20.md) 39 | - [P21坐标系统](chapter21/chapter21.md) 40 | - [P22坐标系统中的线性变化](chapter22/chapter22.md) 41 | - [P23行列式的公式](chapter23/chapter23.md) 42 | - [P24行列式的值](chapter24/chapter24.md) 43 | - [P25特征值与特征向量](chapter25/chapter25.md) 44 | - [P26相似对角化](chapter26/chapter26.md) 45 | - [P27线性变化的相似对角化](chapter27/chapter27.md) 46 | - [P28网页排名](chapter28/chapter28.md) 47 | - [P29正交性](chapter29/chapter29.md) 48 | - [P30正交投影](chapter30/chapter30.md) 49 | - [P31施密特正交化](chapter31/chapter31.md) 50 | - [P32正交基](chapter32/chapter32.md) 51 | - [P33正交矩阵](chapter33/chapter33.md) 52 | - [P34对称矩阵](chapter34/chapter34.md) 53 | - [P35广义向量(1)](chapter35/chapter35.md) 54 | - [P36广义向量(2)](chapter36/chapter36.md) 55 | - [P37奇异值分解](chapter37/chapter37.md) 56 | 57 | ## 主要贡献者(按首字母排名) 58 | - [@DatawhaleXiuyuan](https://github.com/DatawhaleXiuyuan) 59 | - [@spareribs](https://github.com/spareribs) 60 | 61 | 62 | # 关注我们 63 | 64 |
Datawhale,一个专注于AI领域的学习圈子。初衷是for the learner,和学习者一起成长。目前加入学习社群的人数已经数千人,组织了机器学习,深度学习,数据分析,数据挖掘,爬虫,编程,统计学,Mysql,数据竞赛等多个领域的内容学习,微信搜索公众号Datawhale可以加入我们。
65 | 66 | -------------------------------------------------------------------------------- /docs/chapter5/chapter5.md: -------------------------------------------------------------------------------- 1 | 2 | ## 矩阵和向量乘积 3 | ![](res/chapter5-1.png) 4 | 5 | 在高中是定义过matrix跟vector相乘,这节课将会再讲一遍,看看这节课的内容跟你高中学的有什么样的不同。 6 | 7 | 8 | ![](res/chapter5-2.png) 9 | 10 | matrix A的dimension是m by n(m个rows,n个columns),vector x有n个component。我们在高中都学过matrix A乘以vector x,x的dimension一定要是n by 1,如果是其它dimension的话就不match。 11 | 12 | matrix A乘以vector x的定义是:把x第一个element($x_1$)乘以$a_{11}$加上第二个element($x_2$)乘以($a_{12}$),一直到第n个element乘以($a_{1n}$)相加所得的结果放在第一位。x第一个element($x_1$)乘以$a_{21}$加上第二个element($x_2$)乘以($a_{22}$),一直到第n个element乘以($a_{2n}$)相加所得的结果放在第二位,以此类推。 13 | 14 | 15 | ![](res/chapter5-3.png) 16 | 17 | 上述的这个式子其实就是多元一次联立方程式,所以多元一次联立方程我们把它未知数的部分($x_1,x_2,...x_n$)排成一个vector,把它的coefficients($a_{11},a_{12},...,a_{mn}$)排成matrix。多元联立方程式的左边其实就是一个vector x跟matrix A的相乘,右边的$b_1,b_2,...,b_m$可以看做是vector b。所以一次多元联立方程式可以用矩阵跟向量的乘积来描述($Ax=b$)。 18 | 19 | 如果我们用线性系统来解释时,所谓Ax是:线性系统的coefficients描述线性系统的参数(m by n element),当把一个输入**x**($x_1,x_2,...x_n$)丢进线性系统里面时,输出就是Ax。 20 | 21 | matrix A跟vector x相乘这件事有两个不同的看法。 22 | 23 | ### 行的角度 24 | 25 | ![](res/chapter5-4.png) 26 | 27 | 第一个是row aspect,Ax表示把vector x 对A的每一个row做内积得到scalar($a_{11}x_1+a_{12}x_2+...+a_{1n}x_n$),这个scalar是Ax的第一个component。 28 | 29 | ![](res/chapter5-5.png) 30 | 31 | 得到scalar是Ax的第二个component 32 | 33 | 34 | ![](res/chapter5-6.png) 35 | 36 | 得到scalar是Ax最后一个component 37 | 38 | 39 | 可以用vector跟row做inner product这样的方式来理解vector跟matrix的相乘。 40 | 41 | ### 列的角度 42 | 43 | ![](res/chapter5-7.png) 44 | 45 | 另外一个理解的方式是从column的角度来理解,可能多数人习惯从row的角度来理解,但是有些问题从column的角度来理解比较容易。 46 | 47 | 将A的每一个column看做是一个vector,总共有n个vector。column的看法是:将第一个vector乘以$x_1$这个scalar,将第二个component$x_2$乘以A的第二个vector,一直到最后一个向量乘以$x_n$。每一个column乘以scalar做了变化以后,将这些变化后的column相加起来,就是matrix跟vector相乘。 48 | 49 | 50 | ![](res/chapter5-8.png) 51 | 52 | 举一个例子来说明row aspect跟column 
aspect。如图将联立的方程式看做是系统,系统的参数可以看做是$A=\begin{bmatrix} 53 | 1 &4 \\ 54 | -3 &2 55 | \end{bmatrix}$,系统的输入是$x_1,x_2$,输出是$b_1,b_2$,$b_1,b_2$等于Ax。如果将$x_1,x_2$拼起来变为二维vector,将这个二维vector作为linear system输入的时候,它的output就是这个vector跟A相乘。 56 | 57 | 58 | 从row的观点和column的观点来看会有什么样的不同呢?假设输入(-2, 0.5),如果从row的观点来观察:将$\begin{bmatrix} 59 | 1\\ 60 | 4 61 | \end{bmatrix},\begin{bmatrix} 62 | -3\\ 63 | 2 64 | \end{bmatrix}$描绘出来,现在有一个vector是$\begin{bmatrix} 65 | -2\\ 66 | 0.5 67 | \end{bmatrix}$,得到第一个output$b_1$是:红色的箭头跟row1的inner product,第一个output等于0。第二个output$b_2$是:红色的箭头跟row2的inner product,第二个output等于7。 68 | 69 | 从column的观点来看,系统有两个column$\begin{bmatrix} 70 | 1\\ 71 | -3 72 | \end{bmatrix},\begin{bmatrix} 73 | 4\\ 74 | 2 75 | \end{bmatrix}$,当输入是$\begin{bmatrix} 76 | -2\\ 77 | 0.5 78 | \end{bmatrix}$时,将-2乘以第一个column做变化变为绿色虚线的箭头($\begin{bmatrix} 79 | -2\\ 80 | 6 81 | \end{bmatrix}$),然后将0.5乘以第二个column做变化变为$\begin{bmatrix} 82 | 2\\ 83 | 1 84 | \end{bmatrix}$。从column的角度来看,当input$\begin{bmatrix} 85 | -2\\ 86 | 0.5 87 | \end{bmatrix}$时,就是将两个column各自做伸缩,然后相加。 88 | 89 | row aspect跟column aspect的结果是一样的,这就是“横看成岭侧成峰”的概念,它们的观点是不一样的,但是得到的结果是一样的。 90 | 91 | 92 | ![](res/chapter5-9.png) 93 | 94 | matrix-vector相乘的size是要match的。图中的A跟x是不能进行相乘,从row的观点来观察,dimension不一样就不能做inner product,从column观点来看也是不能相乘。从column的观点进行观察:将matrix A中的三个vector做combination然后相加,需要三个数值才能将三个column做combine,二个数值是行不通的。 95 | 96 | column的数目必须跟vector dimension的数目相同才行,A'跟x才能做inner product 97 | 98 | ## 矩阵和向量相乘的性质 99 | 100 | ![](res/chapter5-10.png) 101 | 102 | 这张投影片列出了matrix跟vector相乘的一些性质。 103 | 104 | 105 | 106 | ![](res/chapter5-11.png) 107 | 108 | 109 | 假设A跟B都是m by n的matrix,对于任何一个vector w和A, B相乘,$Aw$等于$Bw$。那么A和B相等吗?(答案是相等/成立的) 110 | 111 | 112 | A乘以standard vector($e_j$)得到的结果为$a_j$。$e_1$的第一个dimension是1,其它dimension为0。matrix A乘以$e_1$(matrix A有n个columns,每一个column可以想象为一个vector)。 113 | 114 | 从column的角度来观察:将所有的column做combine,得到的结果为$a_1$,其它的$e_j$也可以此类推。matrix A乘以$e_j$(第j维为1的vector)得到的结果是matrix第j个column。 115 | 116 |
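上面的row aspect、column aspect,以及A乘以standard vector得到第j个column这几件事,可以用numpy快速验证,数值沿用前面$A=\begin{bmatrix}1&4\\-3&2\end{bmatrix}$、输入(-2, 0.5)的例子(这是笔记补充的示意代码,不是课程内容):

```python
import numpy as np

A = np.array([[1, 4],
              [-3, 2]])
x = np.array([-2.0, 0.5])

# row aspect:x 跟每一个 row 做 inner product
row_view = np.array([A[0] @ x, A[1] @ x])

# column aspect:每一个 column 乘以对应的 scalar 再相加
col_view = x[0] * A[:, 0] + x[1] * A[:, 1]

print(A @ x)  # [0. 7.],跟课程例子里的 b1=0, b2=7 一致
assert np.allclose(row_view, A @ x)
assert np.allclose(col_view, A @ x)

# A 乘以 standard vector e_1 得到 A 的第一个 column
e1 = np.array([1.0, 0.0])
assert np.allclose(A @ e1, A[:, 0])
```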
$Aw$等于$Bw$,将w换为$e_1$,得到$Ae_1=Be_1$。$Ae_1$等于$a_1$(A的第一个column),$Be_1$等于$b_1$(B的第一个column),这就意味着A跟B的第一个column是相同的。$Ae_2$等于$a_2$(A的第二个column),$Be_2$等于$b_2$(B的第二个column),这就意味着A跟B的第二个column是相同的,一直到$Ae_n=Be_n$。所以A=B 117 | 118 | 119 | 120 | 这件事有趣的地方是:两个matrix(linear system)是不是相同,不需要将所有的输入都代入观察输出。其实只要代入n个standard vector,它们的输出是相同的,那么这两个系统就是相同的。 121 | 122 | 123 | ![](res/chapter5-12.png) 124 | 125 | 126 | 127 | 128 | 129 | 130 | 131 | 132 | 133 | 134 | 135 | 136 | 137 | 138 | 139 | 140 | 141 | 142 | 143 | 144 | 145 | -------------------------------------------------------------------------------- /docs/chapter2/chapter2.md: -------------------------------------------------------------------------------- 1 | 2 | ![](res/chapter2-1.png) 3 | 4 | 上节课讲了线性代数就是在分析线性的系统,你可能觉得线性的系统听起来很陌生,也许你过去没有听过线性系统。但是Linear System有另外的表示的方式,这个表示的方式你一定知道,英文叫做System of Linear Equations。 5 | 6 | ## 多元一次联立方程式 7 | 8 | ![](res/chapter2-2.png) 9 | 10 | 11 | 我们先来讲什么是System of Linear Equations,英文表示你可能一下子想不起来是什么,它的中文是“多元一次联立方程式”。 12 | 13 | 你可能常常需要解这个问题:有一个方程式是`$2x_1+3x_2+5x_3=5$`,其中`$x_1,x_2,x_3$`叫做variables,`$2, 3, 5$`叫做coefficients,等号右边的5是一个constant term。像这样`$2x_1+3x_2+5x_3=5$`的多元一次方程式之前是学习过的。那我们如果有一大堆的多元一次方程式,我们就得到了多元一次联立方程式。接下来的问题就是:给你这些多元一次联立方程式(System of linear equations),你是否能够把`$x_1,x_2,x_3$`找出来,我相信你一定知道咋样做。 14 | 15 | 16 | ![](res/chapter2-3.png) 17 | 18 | 19 | 20 | 我们可以把刚才多元一次联立方程式(System of linear equations)写的更加的general一点,更通用的写法。这个写法里面有n个variables(`$x_1,...x_n$`),总共有m个equations,每一个equations都有一个constant term(`$b_1,b_2,...b_m$`)。在m个equations里面每一个variables都有一个coefficient。 21 | 22 | 如果在第一个equation里面的第一个variable(`$x_1$`)的coefficient写做`$a_{11}$`,第一个equation里面的第二个variable(`$x_2$`)的coefficient写做`$a_{12}$`,依次类推,System of linear equations的General形式可以写成上述表达的形式。在这门课里面,m跟n的数目会很大,之前在学习的时候m,n通常都是3。在这门课里面我们要讨论的是m跟n比较大的时候应该咋样处理。 23 | 24 | 25 | ## 术语 26 | ### 定义域 27 | ### 值域 28 | 29 | ![](res/chapter2-4.png) 30 | 31 | 32 | 在讲System of linear equations之前,我们先来讲Terminology。如果我们有一个function f ,function
f会有一个定义域(Domain),定义域的意思是说:所有可以当做这个f的input的集合。function f的输出都会落在对应域(Co-domain):function 33 | f可能出现的output所成的集合。值域(Range)的意思是:function 真正可以输出的结果所成的集合。 34 | 35 | 36 | 同学们比较困惑的点是:Co-domain跟Range有什么样的不同,首先值域不会比对应域大,值域一定是被对应域包含在里面。 37 | 38 | 39 | 举例来说:有一个很简单的function($y=x^2$),它的定义域是R(real number),对应域是:你不假思索的觉得这个output是什么,不要分析这个function的内涵。$y=x^2$的输入是实数(real number),输出y显然也一定是实数(real number)。但是你仔细思考了这个function的内涵以后,会发现$y=x^2$中y一定大于等于0,那么值域其实只包含了0跟所有的正数,所以值域是比所有的real number还要小(positive real number, 0)。 40 | 41 | 42 | ### 一对一 43 | ### 映成 44 | ![](res/chapter2-5.png) 45 | 46 | 接下来还要讲两个名词:一个是“one-to-one”(一对一),一个是Onto(映成)。 47 | 48 | one-to-one(一对一)的意思是:把所有在domain里面的element丢到function里面,得到一个在Co-domain里面的输出,这些输出都是不一样的。也就是说没有两个input有一样的output。 49 | 50 | 51 | 另外一个名词叫做Onto(映成),当你的range等于Co-domain时叫做Onto。你先不假思索的定义出了Co-domain以后,发现function的output其实是涵盖整个Co-domain,Co-domain里面的每一个 52 | element,function都可以output出来,这个时候你的Co-domain等于range。 53 | 54 | 不管是one-to-one还是Onto,range一定不会比Co-domain大。在one-to-one的时候,我们不知道Co-domain跟range是不是一样大,但是我们知道domain跟range是一样大。一样大是一个很模糊的概念,之后我们会讲的更清楚两个Set一样大是什么意思。因为domain里面可能会有无穷多个element,这个range里面也有无穷多个element。但是domain跟range之间的关系是一对一的,每一个在domain里面的element都有一个在range里的element,显然domain跟range一样大。Onto这个function根据它的定义,Co-domain跟range一样大。 55 | 56 | ## 线性系统 57 | 58 | ![](res/chapter2-6.png) 59 | 60 | 61 | Linear System有两个properties,第一个properties:一个function有一个input x 一个output y,一个linear System的意思是:我们如果把输入乘以k倍,输出也会乘以k倍。第二个properties:如果我们输入$x_1$输出$y_1$,输入$x_2$输出$y_2$,那输入$x_1+x_2$时输出是$y_1+y_2$ 62 | 63 | ![](res/chapter2-7.png) 64 | 65 | 66 | 在讲Linear System之前,要先知道大家是不是真的了解linear System的概念,因为很多人对Linear System的想象就是有一条直线是斜的就叫做linear System,其实不是这样的。现在举两个function看是不是linear System。 67 | 68 | 69 | Derivative可以吃一个function f当做input(Derivative本身也是一个function),会输出另外一个function f'(输入function f的微分)。会做微分(Derivative)的function是不是linear System呢?接下来举一个更加具体的例子。假设输入的function是$x^2$,输出的function是输入function做微分的结果$2x$。输入是一个曲线,输出是一条直线,要把曲线压成直线显然是一个很复杂的process,但它是linear
System(微分感觉很复杂,但它确实是线性系统)。 70 | 71 | 有一个function的工作是做积分,然后吃一个function当做输入,会输出一个数值(把输入的function f从a积分到b得到一个数值,这个数值是这个会做积分function的输出)。假设输入是$x^2$,输出是把function f从a积分到b($\frac{1}{3}(b^3-a^3)$),积分这件事情也是线性的。 72 | 73 | ## 线性系统和线性方程组 74 | 75 | ![](res/chapter2-8.png) 76 | 77 | 假设有一个线性的系统(在这门课里面,通常分析线性系统的时候,假设输入是一个向量),这个linear System的输入是一个n维的向量,输出是一个m维向量。input domain是所有n维向量的集合,Co-domain是所有m维向量的集合。你可能会觉得这样输入向量会不会很局限呢?后面的课程会告诉你,事实上这个世界上很多事情都是向量,所以这个分析非常的general,可以apply到非常多的地方。 78 | 79 | 80 | 这个linear system可以写成System of linear equations,反过来讲比较容易,如果你把一个System of linear equations中的variable(x)当做input,将b当做system的constant term(output),那这个System of linear equations显然是一个linear system。因为你只要把这些输入都乘以k倍,输出会显然乘以k倍,所以它符合线性系统的第一个特性。第二个特性也可以用同样的方式验证,这里就不证明了。可以很容易看出一个System of linear equations是linear System。 81 | 82 | 但是反过来讲,一个linear system一定可以写成System of linear equations吗?这是下一个投影片要讲的内容。 83 | 84 | 85 | ### 线性方程组 86 | 87 | ![](res/chapter2-9.png) 88 | 89 | 这篇投影片要讲的是一个linear System一定可以写成System of linear equations,接下来假设你知道咋样分析System of linear equations,你就知道咋样分析linear system。接下来你要说明一个linear system为什么对应有一个System of linear equations。 90 | 91 | 现在有一个linear system(黑盒子),对linear system输入一个standard vector(1,0,...0),这个向量只有第一维是1其它维度都是0,它给出一组输出$a_{11},a_{21},...,a_{m1}$。输入另外一组向量(0,1,...0),会产生$a_{12},a_{22},...,a_{m2}$,直到输入一组向量(0,0, ...1),产生$a_{1n},a_{2n},...,a_{mn}$。假设输入了n个standard vector以后,就可以得到n个输出。 92 | 93 | 根据线性系统的特性,如果你把输入乘以k倍,输出就会乘以k倍。所以现在给输入乘以$x_1$倍,输出也直接乘以$x_1$倍。本来输入(1,0,...0),输出$a_{11},a_{21},...,a_{m1}$,输入输出都乘以$x_1$后,输入变为$x_1,0,...0$,输出变为$a_{11}x_1,a_{21}x_1,...,a_{m1}x_1$,这是线性系统的第一个特性(input乘以k倍,output乘以k倍)。如果将线性系统的输入通通加起来,输出会等于原来各自的输出加起来,也就是:($x_1,0,..0$),($0,x_2,..0$),一直到($0,0,...x_n$)加起来,得到的向量是($x_1,x_2,...x_n$),输出应该是原来没有相加时各个输出的总和。假设输出第一维是$b_1$,第二维是$b_2$,第m维是$b_m$。我们又知道第一维的输出是($a_{11}x_1+a_{12}x_2+...+a_{1n}x_n$),等于$b_1$。第二维的输出是($a_{21}x_1+a_{22}x_2+...+a_{2n}x_n$),等于$b_2$。所以它其实是一个System of linear equations。 94 | 95 | 96 | 今天投影片学到的是:linear system一定可以写成System of linear equations,之后在分析System of linear
equations时,就是在分析linear system。 -------------------------------------------------------------------------------- /docs/chapter7/chapter7.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | ## 有多少解 4 | ![](res/chapter7-1.png) 5 | 6 | 在上一节课我们讲的是检查一个linear equation有没有解,接下来这堂课要讲的是有解的情况下有多少个解呢? 7 | 8 | ![](res/chapter7-2.png) 9 | 10 | 上节课讲了可以使用linear combination和Span的概念检查linear equation有没有解,这堂课要学习的内容是有解的前提下有多少个解呢? 11 | 12 | 13 | ![](res/chapter7-3.png) 14 | 15 | 16 | 假设A的column是independent,或者假设Rank A=n,或者说Nullity A=0,那么$Ax=b$就有唯一解;另外的case是:A的column是dependent,或者Rank A<n,或者说Nullity A>0,那么$Ax=b$一旦有解就有无穷多解。 17 | 18 | 你可能会很疑惑为什么只考虑唯一解和无穷多解的case,怎么不考虑其它的解(比如有两个解,三个解)。一个linear equation有解的时候,它唯一的可能是一个解或者无穷多个解。 19 | 20 | ## Dependent和Independent 21 | 22 | ![](res/chapter7-4.png) 23 | 24 | 25 | 接下来介绍两个专有名词dependent(依赖的,不独立的)和independent(独立的,自主的) 26 | 27 | ### 定义 28 | 29 | ![](res/chapter7-5.png) 30 | 31 | 32 | 如果有一个vector set($a_1,a_2,...,a_n$)是linear dependent就要满足下列的条件:如果能找到一组不全是0的scalar($x_1,x_2,...x_n$),然后将scalar和vector进行linear combination,得到的结果是zero vector,那我们就说这组vector是dependent。如果想要scalar和vector进行linear combination得到的结果为0,唯一的可能是coefficient都为0,那我们就说这组vector是independent。 33 | 34 | 如果可以找到一组scalar可以让linear combination等于0,就可以找到无穷多组 35 | 36 | ### 线性相关 37 | 38 | ![](res/chapter7-6.png) 39 | 40 | 41 | 假设有一组vector set,在这个vector set中存在某一个vector($a_i$)是其它vectors的linear combination,那我们就说这个vector set是dependent。 现在有一个vector set($\begin{bmatrix} 42 | -4\\ 43 | 12\\ 44 | 6 45 | \end{bmatrix}, \begin{bmatrix} 46 | -10\\ 47 | 30\\ 48 | 15 49 | \end{bmatrix}$),若将vector set里面的第二个vector乘以$\frac{2}{5}$就可以得到第一个vector,所以第一个vector是第二个vector的linear combination。所以这是一个dependent vector set。 50 | 51 | 52 | 53 | 假设现在有一组vector set($\begin{bmatrix} 54 | 6\\ 55 | 3\\ 56 | 3 57 | \end{bmatrix},\begin{bmatrix} 58 | 1\\ 59 | 8\\ 60 | 3 61 | \end{bmatrix},\begin{bmatrix} 62 | 7\\ 63 | 11\\ 64 | 6 65 | \end{bmatrix}$),第三个vector等于第一个vector加上第二个vector,所以第三个vector是前面两个vector的linear combination,所以这是一个dependent vector set。 66
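上面这种dependent的判断也可以用numpy的rank来验证,把三个vector排成matrix的columns,rank小于column数就代表dependent(这是笔记补充的示意代码,不是课程内容):

```python
import numpy as np

# 把上面例子里的三个vector排成matrix的columns
A = np.array([[6, 1, 7],
              [3, 8, 11],
              [3, 3, 6]])

rank = np.linalg.matrix_rank(A)
print(rank)  # 2:rank小于column数3,所以这组vector是dependent
assert rank < A.shape[1]
```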
| 67 | 68 | ![](res/chapter7-7.png) 69 | 70 | 71 | 假设现在有一个vector set($\begin{bmatrix} 72 | 3\\ 73 | -1\\ 74 | 7 75 | \end{bmatrix}, \begin{bmatrix} 76 | 0\\ 77 | 0\\ 78 | 0 79 | \end{bmatrix},\begin{bmatrix} 80 | -2\\ 81 | 5\\ 82 | 1 83 | \end{bmatrix} $),第二个向量等于第一个向量乘以0加上第三个向量乘以0,所以第二个向量是其它向量的linear combination,所以这个vector set是dependent 84 | 85 | 86 | 这个示例告诉我们,如果在一个vector set里面有zero vector,那这个vector set就是dependent 87 | 88 | 89 | 90 | ![](res/chapter7-8.png) 91 | 92 | 93 | 如图的两个描述都是在讲一件事情,假设有一个vector set是dependent,根据dependent的定义就是找一组不全为0的系数做linear combination得到zero vector。假设$2a_i+a_j+3a_k=0$,将$3a_k$移到等号的右边得到$2a_i+a_j=-3a_k$,等号的两边同时除以-3得到$-\frac{2}{3}a_i+(-\frac{1}{3}a_j)=a_k$,换句话就是:$a_k$是$a_i$和$a_j$的linear combination。所以这个vector set是dependent 94 | 95 | 反过来说也是一样的,dependent的意思是$a_i'=3a_j'+4a_k'$(其中某一个vector是其他vector的linear combination),这句话的描述等同于将所有的项移到等式的一边等于0。 96 | 97 | 所以上述两句话的描述是讲同一件事情。 98 | 99 | 100 | ![](res/chapter7-9.png) 101 | 102 | 假设有一个linear equation,coefficient matrix的column是dependent,若这个system of linear equations有解,就是无穷多解。但不保证因为matrix的column是dependent,就是有解,dependent跟有没有解这件事情是没有关系的。 103 | 104 | 105 | 假设现在有system of linear equations为$\begin{bmatrix} 106 | 6 &1 &7 \\ 107 | 3&8 &11 \\ 108 | 3 &3 &6 109 | \end{bmatrix}\begin{bmatrix} 110 | x_1\\ 111 | x_2\\ 112 | x_3 113 | \end{bmatrix}=\begin{bmatrix} 114 | 14\\ 115 | 22\\ 116 | 12 117 | \end{bmatrix}$,其中coefficient matrix为$\begin{bmatrix} 118 | 6 &1 &7 \\ 119 | 3 & 8 &11 \\ 120 | 3 & 3 &6 121 | \end{bmatrix}$,这个vector set中第三个vector 等于第一个vector和第二个vector相加,所以这个vector set是dependent,若这个linear equation一旦有解,就是有无穷多解。 122 | 123 | 124 | 125 | 126 | 127 | 128 | 为什么一旦有解,就是有无穷多解呢?先假设我们找得到一组解为$x_1=1,x_2=1,x_3=1$($\begin{bmatrix} 129 | x_1\\ 130 | x_2\\ 131 | x_3 132 | \end{bmatrix}$),按照dependent的性质:多找到一组解,就是找到无穷多解。dependent的性质:其中的一个vector是其它vector的linear combination,我们可以把能用linear combination表示的vector换掉($1*\begin{bmatrix} 133 | 6\\ 134 | 3\\ 135 | 3 136 | \end{bmatrix}+1*\begin{bmatrix} 137 | 1\\ 138 | 8\\ 139 | 3 140 |
\end{bmatrix}=\begin{bmatrix} 141 | 7\\ 142 | 11\\ 143 | 6 144 | \end{bmatrix}$,$2*\begin{bmatrix} 145 | 6\\ 146 | 3\\ 147 | 3 148 | \end{bmatrix}+2*\begin{bmatrix} 149 | 1\\ 150 | 8\\ 151 | 3 152 | \end{bmatrix}=\begin{bmatrix} 153 | 14\\ 154 | 22\\ 155 | 12 156 | \end{bmatrix}$),就等于找到另外一组解为$x_1=2, x_2=2, x_3=0$($\begin{bmatrix} 157 | x_1\\ 158 | x_2\\ 159 | x_3 160 | \end{bmatrix}= 161 | \begin{bmatrix} 162 | 2\\ 163 | 2\\ 164 | 0 165 | \end{bmatrix}$),那么就会有无穷多解。 166 | 167 | 168 | ## 证明 169 | 170 | ### 有解时会有无穷多解 171 | ![](res/chapter7-10.png) 172 | 173 | 174 | 接下来是要证明的是:column是dependent在有解时就会有无穷多解,若我们想要证明这两件事情是等价的。那我们证明的时候就要分为分为两个方向:A的columns是dependent,一旦有解就是有无穷多解(dependent——无穷多解;若linear equation一旦有解,那这个linear equation 的column就是dependent。 175 | 176 | ### homogeneous 177 | 178 | ![](res/chapter7-11.png) 179 | 180 | 若有一个system linear equation的constant term等于0(b=0),那我们就这个linear equation是homogeneous,homogeneous linear equation一定有解(一定有一组解为zero vector) 181 | 182 | 根据dependent和independent的定义我们马上可以得到以下的结论:若有一个vector set($a_1,a_2,a_n$)是dependent,用这组vector set来组成matrix A来产生一个homogeneous linear equation一定会有无穷多解。 183 | 184 | 假设有一组vector set($a_1,a_2,...a_n$)是independent,若要找一组参数$x_1,x_2,...x_n$乘以$a_1,a_2,...a_n$等于0,唯一的可能是:$x_1,x_2,...x_n$都为0(s定义) 185 | 186 | ### dependent——无穷多解 187 | 188 | ![](res/chapter7-12.png) 189 | 190 | 191 | 刚才只讲了homogeneous linear equation,现在要讲的是generous的情况。若matrix A一旦为dependent时,如果$Ax=b$有解,就一定会有无穷多解。 192 | 193 | 我们已经知道matrix A是dependent,根据dependent的定义:若homogeneous linear equation(Au=0),其中u可以为non-zero vector。我们又知道$Ax=b$有解,现在存在v,使得$Av=b$。 194 | 195 | 现在已经得到$Au=0, Av=b$,接下来可将两个式子左右进行相加得到$A(u+v)=b$,非零的u和原来已经存在解v相加后得到新的solution,那我们就会得到无穷多的解(dependent——无穷多解)。 196 | 197 | 若现在有无穷多的solution,那就一定能够找到两个不相同的solution(u, v),那么就会有$Au=b, Av=b$。接下来将这个式子进行相减得到$A(u-v)=0$,因为u不等于v,所以u-v不等于zero vector。这就是dependent的定义(能够找得到不全为零的coefficient跟matrix A的column进行linear combination后等于zero vector) 198 | 199 | ## Rank与Nullity 200 | 201 | ![](res/chapter7-13.png) 202 | 203 | 
接下来要讲的内容是Rank与Nullity 204 | 205 | 206 | ### 计算Rank与Nulity的数目 207 | ![](res/chapter7-14.png) 208 | 209 | 在matrix的column里可以找到最多independent columns的数目就叫做matrix的rank,剩下的column数目就是Nullify(Number of columns - rank) 210 | 211 | ### 示例 212 | 213 | 假设有matrix$\begin{bmatrix} 214 | -3 &2 &-1 \\ 215 | 7& 9 &0 \\ 216 | 0 & 0 & 2 217 | \end{bmatrix}$,那个这个matrix的rank是多少呢?这个问题等同于从这个matrix set里面可以找到多少个vector放在一起是independent。经过观察这个matrix是independent的,rank等于3,nullity等于0 218 | 219 | 220 | 假设有matrix$\begin{bmatrix} 221 | 1 &3 &10 \\ 222 | 2& 6 &20 \\ 223 | 3 & 9 & 30 224 | \end{bmatrix}$,经过观察这个matrix的rank等于1,nullity等于2(多个vector时,按照某一个vector是其它vector的linear combination。只有一个vector时,按照dependent和independent定义)。 225 | 226 | 227 | 假设有matrix$\begin{bmatrix} 228 | 0 &0 &0 \\ 229 | 0& 0&0 \\ 230 | 0 & 0 & 0 231 | \end{bmatrix}$,经过观察这个matrix的rank等于0,nullity等于3 232 | 233 | 234 | ![](res/chapter7-15.png) 235 | 236 | 237 | 假设有matrix$\begin{bmatrix} 238 | 1 &3 &4 \\ 239 | 2 & 6 & 8 240 | \end{bmatrix}$,经过观察这个matrix的rank等于1,nullity等于2。 241 | 242 | 假设有matrix$ \begin{bmatrix} 243 | 0 &3 \\ 244 | 0&5 245 | \end{bmatrix}$,若rank=2时,因为这个matrix set里面有zero vector,所以是dependent。经过观察这个matrix的rank=1,nullity等于1($\begin{bmatrix} 246 | 3\\ 247 | 5 248 | \end{bmatrix}$)。 249 | 250 | 假设有matrix$\begin{bmatrix} 251 | 5\\ 252 | 2 253 | \end{bmatrix}$,经过观察这个matrix的rank等于1,nullity等于0。 254 | 255 | 假设matrix里面只有一个element$[6]$,经过观察这个matrix的rank等于1,nullity等于0。 256 | 257 | 258 | ![](res/chapter7-16.png) 259 | 260 | 261 | 假设matrix A是m by n的matrix(m个row,n个column),若它的rank是n(nullity为0)代表它的columns都为independent,若columns都为independent代表为只会有唯一解 262 | 263 | ## 判断唯一解还是无穷多解 264 | 265 | ![](res/chapter7-17.png) 266 | 267 | 268 | 我们这堂课学到的内容是:一个linear equation有解以后,进一步根据independent,根据rnk,根据nullity判断是唯一解还是无穷多解。 269 | 270 | ## 流程图的另一种画法 271 | 272 | ![](res/chapter7-18.png) 273 | 274 | 275 | 上述的流程图还可以这样来画,给定一个linear equation的coefficient 是matrix A,先判断是否为independent,或者检查rank等于n,或者检查nullity等于0。检查完以后不能够说:若matrix A为dependent就是有无穷多解,matrix 
A为independent为independent就是有唯一解。因为dependent还是independent跟有没有解是没有关系的,是有解的情况下判断是有唯一解还是无穷多解。 276 | 277 | 所以就算是知道column为dependent还是independent,你仍然不知道它到底有多少解。你还要进一步去确认到底是否有解,如果是在independent的情况下发现是没有解,那就是没有解;发现是有解的,那就是有唯一解。若不是在independent(dependent)的情况下,不满足如图的两个条件就是没有解,一旦有解就时有无穷多解。 278 | 279 | 280 | 281 | 282 | 283 | 284 | 285 | 286 | 287 | 288 | 289 | 290 | 291 | 292 | 293 | 294 | 295 | 296 | 297 | 298 | 299 | -------------------------------------------------------------------------------- /docs/chapter1/chapter1.md: -------------------------------------------------------------------------------- 1 | ![](res/chapter1-1.png) 2 | 3 | 讲一个线性代数是什么,在线性代数学问里面,我们预期学到什么样的知识,预期它可以解决什么样的问题 4 | 5 | ![](res/chapter1-2.png) 6 | 7 | 在这之前呢,我们要先提一个东西,这个东西叫做系统,叫做system,System这个东西其实在信号与系统里面你会更详细的学到相关的知识。 8 | 9 | 我们来稍微说下什么是系统,系统概念非常简单。你给它一个输入,它就会给你一个输出。你觉得用系统这个词觉得有点陌生的话,那你其实可以把系统又叫是一个function。如果讲function的话,你可能就觉得比较熟悉了。大家在高中的时候一定都学过函数(一定都学过function),它就是输入一个input,输出一个output。那随着情景的不同,Syetem有时候把它叫做transformation,有时候叫做operator,那指的都是一样的东西。 10 | 11 | 举例来说:语音辨识系统,它的输入是一段声音信号,而它的输出是这段声音信号的文字,我想大家这个都不陌生。在手机上,都有语音辨识的功能。 12 | 13 | 或者是对话系统(Siri, Alexa),对话系统做的事情是什么呢?它的输入是一句话,输出是系统的回应(Siri说:"How are you",Alexa就会说:"I am fine") 14 | 15 | 通讯系统指的是什么呢?如果今天你用手机跟你朋友聊天,你对着你的手机说:“Hello”,手机会把这个讯号传到基地台,然后在通过基地台,这个讯号会传到你朋友的手机那端,中间会有很冗长的process,直接用一个长箭头来表示。你在你手机那端说一个“Hello”,你朋友会听到“Hello”。那你可以把这整个讯号传输的过程(把你的“Hello”送过朋友手机那端)的当做是一个系统。这个系统的输入就是你说的话,系统的输出就是手机那端听到的声音。 16 | 17 | ## 线性系统的特征 18 | 19 | ![](res/chapter1-3.png) 20 | 21 | 在线性代数这门课里面我们要学的是什么呢?其实在线性代数这门课里面,我们要讨论的就是线性的系统,我们会对线性的系统做各式各样的了解和分析。 22 | 23 | 那什么是线性的系统呢?我们刚才已经讲了什么是系统(有一个输入,有一个输出)。线性的系统它的输入和输出之间有某些特定的关系,输入和输出应该要有下面这两个特征。 24 | 25 | 第一特征是Presevering Multiplication。假设现在有一个线性系统,输入是一个x,输出是一个y。你现在简单的将x想成是一个数值(number),y是另外的一个数值。有一个系统,输入是一个x,输出是一个y,如果它是一个线性系统的话,输入和输出之间要有这样的一个关系(如果把输出乘以k倍,输出也应该被乘以k倍),这是线性系统的第一个特征。 26 | 27 | 28 | 第二个特征使Persevering Addition,假设现在线性系统的输入是$x_1$,输出是$y_1$,输入是$x_2$,输出是$y_2$。那如果我们输入$x_1+x_2$,那它的输出是$y_1+y_2$ 29 | 30 | 31 | 
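The two properties just defined — preserving multiplication and preserving addition — can be verified numerically for the two-input, two-output example the notes use later (first output is the sum of the inputs, second output is the difference). A small sketch, with the system written as a plain function and the helper names (`scale`, `add`) my own:

```python
def system(x):
    """Example linear system: inputs (x1, x2), outputs (x1 + x2, x1 - x2)."""
    x1, x2 = x
    return (x1 + x2, x1 - x2)

def scale(k, v):
    return tuple(k * a for a in v)

def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

x, xp, k = (3, 5), (-2, 7), 4

# Preserving multiplication: scaling the input by k scales the output by k
assert system(scale(k, x)) == scale(k, system(x))
# Preserving addition: the output of a summed input is the sum of the outputs
assert system(add(x, xp)) == add(system(x), system(xp))
print("both linearity properties hold")
```

The same two checks fail for the nonlinear example x → x², since (kx)² = k²x² rather than k·x².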
![](res/chapter1-4.png) 32 | 33 | 并不是所有的系统都是线性的,如果这个系统不是的线性的话,就不在我们这门课的讨论范围之内。 34 | 35 | 举例来说:这个系统的输入是x,输出是$x^2$,那现在让我们确定下是否满足这两个线性系统的特征。 36 | 37 | 这个系统输入x,输出是$x^2$,当这个系统输入kx的时候,线性系统输出的是$kx^2$,但是按照这个系统本身的特性,它输出会是$k^2x^2$,所以这不是一个线性的系统 38 | 39 | 线性系统的第二个特征,它也没有满足。输入$x_1$,输出$x_1^2$,输入$x_2$,输出$x_2^2$。如果是一个线性的系统,你输入$x_1+x_2$,这时候会输出$x_1^2+x_2^2$。但是你这个系统你输入$x_1+x_2$,而输出是$(x_1+x_2)^2$,$(x_1+x_2)^2$并不等于$x_1^2+x_2^2$。 40 | 41 | 所以这个系统不是一个线性的系统,因为它并不满足刚才讲的线性系统的两个特征,如果是这样的系统,就不在我们这门课的讨论范围之内。 42 | 43 | ![](res/chapter1-5.png) 44 | 45 | 我们在举一个例子来说明:线性系统看起来是一个什么样子。我们刚才举的例子里面,线性系统就是输入一个x,它的输出就是一个y。 46 | 47 | 事实上这个系统可以输入更复杂的东西,之后会讲到说:输入甚至可以是一段连续的信号。如果要语音辨识系统的话,它输入是一段讯号,不仅仅是一个数值而已。 48 | 49 | 50 | 我们在举一个复杂的例子来说明线性系统是什么样子,现在假设我们有一个系统,输入是有两个数值($x_1,x_2$),输出也是两个数值(第一个输出是将两个输入相加起来,第二个输出是将两个输入相减)。这个系统有两个输入,两个输出。接下来的问题是:它是不是一个线性的系统,我们要check一下是否满足我们定义的线性系统的两个properties 51 | 52 | 第一个properties:输入$x_1,x_2$,输出$x_1+x_2,x_1-x_2$。将输入乘以k倍(输入$kx_1,k_2$),输入会变为$kx_1+kx_2,kx_1-kx_2$。输入乘以k倍,输出也会乘以k倍。所以说,线性系统的第一个特征是满足的。 53 | 54 | 55 | 线性系统的第二个特征,如果你输入$x_1,x_2$,输出$x_1+x_2,x_1-x_2$,输入$x_1',x_2'$,输出$x_1'+x_2',x_1'-x_2'$。那如果我们今天把输入加起来,输入$x_1+x_1',x_2+x_2'$,输出$(x_1+x_1')+(x_2+x_2'),(x_1+x_1')-(x_2+x_2')$ 56 | 57 | ## 应用 58 | 59 | ### 电路学 60 | 61 | ![](res/chapter1-6.png) 62 | 63 | 既然我们都已经讲了线性的系统,它有怎么样的应用呢。听起来好像是一件简单的东西,它可以被用在什么样的地方呢?第一个举得例子是在电路学这门课里面,所有的系统通通都是线性的系统。电路学这门课其实就是做了一件事,给你一个电路,这个电路上有一个输入,这个输入可能是一个电压源也可能是电流源(告诉你电压源有多少伏特,电流源有多少安培)。然后让你计算它的输出是多少(某个负载,某个支路的电压是多少) 64 | 65 | 66 | 67 | 其实整个电路就是一个系统,电压,电流源就是输入,负载上的电压和电流是输出。你可以check一下会发现说:整门电路里面我们的电路通通都是线性的,就算是有加电容,电感,事实上这样也是线性的。(电压源的电压乘以两倍,是否负载的电压和电流就会乘以两倍。你会发现这是符合线性系统的特征的) 68 | 69 | ### 信号与系统 70 | 71 | ![](res/chapter1-7.png) 72 | 73 | 第二个用用到线性代数的课是信号与系统,在信号与系统里面我们很多时候都会假设我们的系统是线性的。在信号与系统课中你会发现习题可能都会开头告诉你,我们现在现在讨论的是一个LTI(Linear Time Invariant System)的系统。 74 | 75 | 76 | 所以你学了线性代数,你再去学信号与系统,你会信号与系统的原理,会有比较深度的体悟。 77 | 78 | ![](res/chapter1-8.png) 79 | 80 | 81 | 线性代数会用在信号与系统的什么地方呢?在信号与系统里面你会学到Fourier Transorm(比较难),Fourier Transorm在信号与系统里面会反复不断的出现。 82 | 83 | 
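The Fourier transform named above is itself a linear system, which can be checked directly. A sketch using a naive discrete Fourier transform (this DFT implementation is my own illustration, not anything from the course): transforming a scaled sum of two signals gives the same result as scaling and summing their transforms.

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform of a finite sequence x."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * f * t / n) for t in range(n))
            for f in range(n)]

x = [1.0, 2.0, 0.0, -1.0]
y = [0.5, -3.0, 4.0, 2.0]
k = 2.5

lhs = dft([k * a + b for a, b in zip(x, y)])       # DFT(k*x + y)
rhs = [k * a + b for a, b in zip(dft(x), dft(y))]  # k*DFT(x) + DFT(y)
assert all(abs(a - b) < 1e-9 for a, b in zip(lhs, rhs))
print("the DFT preserves scaling and addition")
```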
有一段信号(可能是声音讯号也有可能是其他的讯号),你会做一个神奇的transform叫做Fourier Transorm,把它变为frequency domain。 84 | 85 | Fourier Transorm其实也是Linear System,输入讯号,输出讯号。其实这两个讯号是同一个讯号,你只是在不同的观点来描述这个讯号(一个同学,我可以叫你的名字,也可以叫你的学号),不同的方式来称呼会有不同的好处。比如说:点名的时候,叫名字比叫学号更方便一点。time domain跟frequency domain对一个讯号来说,它就是同一个讯号的两个不同的名字。 86 | 87 | 通常在信号与系统这门课里面,给你time domain要转成frequency domain,这个准换的过程其实是一个线性的转换。 88 | 89 | 90 | 91 | Fourier Transorm非常的复杂,很多同学觉得他不是线性的,很多人想到线性的时候,觉得画一条直线就是线性的。但是你仔细想一下,即便是很复杂的function,按照我们这门课刚才讲的线性系统的那两个特征,那仍然是一个线性的系统。Fourier Transor可以用我们在线性代数这门课里面所学到的种种概念来加以理解, 92 | 93 | 94 | 95 | ![](res/chapter1-9.png) 96 | 97 | 用线性代数里面学到的东西,你可以做一个线性的系统,这个线性的系统可以做预测。在作业里面,有一个作业里面是:根据已经搜集到的资料,预测某一个观测站在下一个时间点PM2.5的数值 98 | 99 | ### 预测 100 | 101 | ![](res/chapter1-10.png) 102 | 103 | 建一个可以做气象预报的系统,系统的输入是$x_1,x_2,x_3$,输出记作y。y这个数值是:某年某月某日某事PM2.5的值,这个值是我们希望这个气象预报系统可以预测出来的PM2.5数值。$x_k$指的是我们要预测这个时间的前k个小时观测站的PM2.5的数值。这是一个系统,进一步我为了计算方便,假设它是一个线性的系统。$x_1,x_2,x_3$和y之间的关系为:$y=w_1x_1+w_2x_2+w_3x_3$。 104 | 105 | 你会发现这个系统符合刚才定义的线性系统的两个特性,如果知道$w_1,w_2,w_3$,你就可以把$x_1,x_2,x_3$输进去,它就可以输出y,你就可以做气象预报。 106 | 107 | 108 | 你可能会有问题:用一个线性的系统,可以准确的预测PM2.5吗?过去的PM2.5跟未来PM2.5之间的关系感觉会用很多不同事件的影响,感觉之间的关系非常的复杂,也许它们之间并不是一个线性的关系。但是你会发现说:就算你假设它们之间的关系是线性的,其实误差也不会有太大的偏差。 109 | 110 | 第二个问题是:这个$w_1, w_2, w_3$是咋样来的呢?我们肿么知道这个系统里面的参数长什么样子呢?我们肿么知道输入和输出之间要有咋样的关系?我们只知道它们之间的关系是线性的,$w_1,w_2,w_3$的数值是多少,这个咋样找出来的呢?你可以主观的设置(1, 1, 1.5),但这样的效果可能不会太好。实际上你要做的事情就是:让它自己学出来。这个就是“机器学习”的基本概念。 111 | 112 | 113 | ### Google的搜寻 114 | 115 | ![](res/chapter1-11.png) 116 | 117 | 线性代数还有很多的应用,举一个在日常生活中都会用到线性代数的应用——Google的搜寻。在Google搜寻的时候,你输入一个关键字,它会给你一排的搜寻的结果。这一排搜寻的结果, 假设你在这里的关键字是“台湾大学”,你搜寻出来的网页多数包含台湾大学。很多网页都有包含台湾大学,那应该咋样去排序呢?Google肿么知道把台湾大学首页排在整个搜寻列表的第一个呢?而不是排在最后一个。 118 | 119 | 这个可能要用到PageRank的技术,也就是有一个方法来计算每个网页的中重要性,PageRank代表了网页本身的重要性。 120 | 121 | ### PageRank 122 | 123 | ![](res/chapter1-12.png) 124 | 125 | 在这个示意图上面,每一个笑脸代表一个网页。笑脸和笑脸之间他们有箭头连接在一起,这些箭头指的是:网页和网页之间的超链接 126 | 127 | 
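The random-surfer idea behind PageRank sketched above — follow one outgoing hyperlink uniformly at random — makes the step from the time-t reader distribution to the time-(t+1) distribution a linear map. A toy sketch with a made-up three-page link graph (the pages and links here are illustrative, not the ones in the figure):

```python
# links[i] lists the pages that page i links to (hypothetical 3-page web)
links = {0: [1, 2], 1: [0], 2: [0, 1]}
n = 3

# column-stochastic transition matrix: P[j][i] = prob. of moving i -> j
P = [[0.0] * n for _ in range(n)]
for i, outs in links.items():
    for j in outs:
        P[j][i] = 1.0 / len(outs)

dist = [1.0 / n] * n  # start with readers spread uniformly
for _ in range(100):  # each t -> t+1 update is the linear map dist -> P * dist
    dist = [sum(P[j][i] * dist[i] for i in range(n)) for j in range(n)]

print([round(p, 3) for p in dist])  # → [0.444, 0.333, 0.222]
```

Iterating the same linear map long enough converges to a stationary distribution: page 0, which every other page links to, ends up with the largest share of readers, matching the claim that heavily linked pages collect the most visitors.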
PageRank所做的基本假设是:假设现在有一堆人,在网路上浏览网页。他们平常逛网路的习惯是:假设这个人现在在看蓝色这个网页,它看完以后,想要跳转到下一个网页,跳转到哪一个网页,取决于这个蓝色的网页有哪些超连接,若这个蓝色的网页有3个超连接,分别连到红、黄、绿三个网页。这代表这个人看完这个网页以后。假设有$\frac{1}{3}$的几率去看红色,$\frac{1}{3}$的几率去看黄色的,$\frac{1}{3}$的几率去看绿色的。 128 | 129 | 130 | 假设有一堆人在网路上胡乱浏览网页,你就可以计算:在t个时间点,假设某个网页上的分布是某个样子,那下一个时间点,人数的分布会是什么样子。第t个时间点跟t+1个时间的人数分布它们中间其实是一个线性的关系。 131 | 132 | 同一个概念其实不是只能用在网页浏览上,比如说预测车流量,预测人口的流动。把这个网页换做城市,在每一个城市里面,每一个人想要搬去其它城市。如果你本来在蓝色的城市,有一定的几率想去红色的,有一定的几率想去黄色的,有一定的几率想去绿色的。你可以用同样的概率来预测人口的变化 133 | 134 | ![](res/chapter1-13.png) 135 | 136 | 假设有一堆人在网路上看来看去,然后根据PageRank的计算,可以计算出每一个网页平均的人数是多少。现在随机给一个分布,然后经过无穷远的时候以后。大家会到底集中在哪些网页上。(如果有一个网,被很多人link到,最后看这个网页的会特别多,或者某个网页被某个重要的网页到,那看它的人也会特别多。 ) 137 | 138 | ### Computer Graphics 139 | 140 | ![](res/chapter1-14.png) 141 | 142 | 线性代数还可以用在Computer Graphics上面,在Computer Graphics里面有时候你想要把一个物件进行旋转。假设有一个方形,左上角点的坐标为$\begin{bmatrix} 143 | x\\ 144 | y\\ 145 | z 146 | \end{bmatrix}$,我们想要把这个物件进行旋转,旋转以后同一个点的坐标就会变为$\begin{bmatrix} 147 | x'\\ 148 | y'\\ 149 | z' 150 | \end{bmatrix}$ 151 | 152 | 153 | 从 $\begin{bmatrix} 154 | x\\ 155 | y\\ 156 | z 157 | \end{bmatrix}$到$\begin{bmatrix} 158 | x'\\ 159 | y'\\ 160 | z' 161 | \end{bmatrix}$之间的关系你可以用一个线性系统来描述它。也就是说物件上的每一个点坐标跟转换后每一个点的坐标,它们之间的关系是线性的。你会发现线性代数会有非常非常多的应用 162 | 163 | 164 | ## 具体学习章节 165 | ### 解释 166 | 167 | ![](res/chapter1-15.png) 168 | 169 | 在这门课里面,我们会针对线性系统做哪些了解和分析,在chapter1-3里面我们做的就是解释。给你一个线性的系统,然后给你线性系统的输出(比如说:3,6,9) ,接下来问你这个线性系统要输入咋样的数值它可以得到我们想要的输出。在chapter1,chapter2,chapter3我们就是在问这件事情。可能有好几个不同的可能是要讨论,第一个就是到底有没有solution,也就是这个红色问号到底填不填的进去某种数值让这个线性系统的输出就是我们给定的结果,还是说你找不到任何可能的输入得到我们想要的输出。 170 | 171 | ### 这个解是唯一的吗 172 | 173 | 如果我们今天找得到某种输入可以得到想要的输出,接下来的问题就是这个解是唯一的吗?只有这一个输入可以得到我们想要的输出,还是其实有很多不同的输入都可以得到我们想要的输出。接下来第三个问题就是我们咋样找到这个解,既然我们确定有个解存在,那接下来我们会知道说:它可能是一个还是有多个,接下来是咋样找到这个解。另外我们会讲Determinants 174 | 175 | 
其实在chapter1到chapter3里面,所讲的内容其实你在高中都学过,其实一个线性的系统就是一个多元一次的联立方程式。所以我们今天问你说:多元一次联立方程式到底有没有解,是不是唯一解,咋样来找它的解,其实你都会。只是在线性代数里面我们会用不同的观点来看待过去你已经都知道的事情。也就是说chapter1到chapter3里面的习题你完全可以会的,但是你用线性代数的观念来看的话,你会有不同的理解。 176 | 177 | 178 | 还有一个不一样就是高中只会讲3x3的行列式,但是在这门课里面会推广到nxn的行列式。通常你能够在高中里学到的线性系统,它的输入或输出是一堆数字,把这堆数字合起来叫做一个向量。其实向量这个概念你在高中就已经学过了,但是在这门课里面,我们会进一步的告诉你说其实线性系统的输入不一定是你所想象的那种向量,你所想象的那种向量就是一个点或者是二维(三维)空间中的一个箭头,这些都是在高中学过的东西。如果今天在这门课我告诉你说:vector并不仅局限于你想象的这个样子,举例来说一个function,一个讯号,一个会随着时间变化的讯号 179 | ,你也可以想象成一个vector。它不是你高中学过的vector,它才是真正的vector。它可以当做线性系统的输入,也可以是线性系统的输出,这是我们之后才会学到的。 180 | 181 | 所以很多的概念是你高中学到的,但是这门课里面还是会讲很多不一样的东西。 182 | 183 | 184 | ![](res/chapter1-16.png) 185 | 186 | 在第三章里面我只是给你一个指定的输出,要你找出什么样的输入可以得到这个输出。从第四章开始我们会讨论更加General的问题,其实线性系统输入其实是一个集合(各式各样的输入),把集合中的每一个element都输入进去就得到了另外一个集合。在第四章里面,我要问的是咋样来描述这个集合,我们不是用集合这个词汇来描述输出所有的可能性的总和,我们会用另外一个词汇来描述它。在线性代数里面有另外一个词汇来描述线性系统所有输出的总和,那就会告诉大家维度的概念,维度这个概念其实也是学习过的。但是在线性代数这门课里面,你会清楚的认识到我们讲维度的时候到底举的是什么。 187 | 188 | ### 咋样来描述这个集合 189 | 190 | 另外一个要告诉大家的是,既然我们能够描述这个集合,描述的方法其实不止一种的,可以选择比较容易的方式来描述这个集合。选择比较容易的方式来描述这个集合的时候,线性系统就会改变,变成一个可能是比较简单的线性系统,那你来计算或者分析它就是变得比较方便。 191 | 192 | 在第七章我们会学到,假设我们指定的输出其实不在这个输出范围的集合里面,那线性系统就不可能得到这个输出。但是我们咋样在这个可能的集合里面,找到一个element,跟我们指定输出的距离是最近的 193 | 194 | ### Eigen value 195 | 196 | ![](res/chapter1-17.png) 197 | 198 | 在第五章会讲一个Eigen value,有些linear System它会有些特别的输入,你输入$x_1,x_2,...,x_n$,它会直接给输入乘上2倍变大。或者是你输入另外一个向量,输出会把它变小。输出和输入的形状是一样的,但是它们的scale不一样。有了这些知识以后就可以做滤波器,滤波器要做的事情就是让某种类型的讯号过去,某种类型的讯号不能过去。你有一个第五章讲的概念以后,你就可以知道咋样设计一个滤波器,让某些讯号进来可以被放大,某些讯号进来可以被缩小。设计滤波器的种种原理,会在第五章里面提到基本的概念 199 | 200 | 201 | 202 | ![](res/chapter1-18.png) 203 | 204 | 今天这堂课的结论就是:线性代数是很重要的,在电机系如果没有把线性代数学好就好像没有学念能力却到了天空们技场两百楼。 205 | 206 | 207 | 208 | 209 | 210 | 211 | 212 | 213 | 214 | 215 | 216 | 217 | 218 | 219 | 220 | 221 | 222 | 223 | 224 | 225 | 226 | 227 | 228 | 229 | 230 | -------------------------------------------------------------------------------- /docs/chapter6/chapter6.md: 
-------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | ![](res/chapter6-1.png) 5 | 6 | 7 | 之前已经讲了线性系统是System of Linear Equations,System of Linear Equations可以表示为matrix跟vector相乘。接下来的问题是:我们咋样知道System of Linear Equations有解还是无解。 8 | 9 | ![](res/chapter6-2.png) 10 | 11 | 12 | 我们已经System of Linear Equations跟Matrix-vector product是同一件事情。接下来要讨论的是:一个线性的系统是否有解。 13 | 14 | 假设给定A和b,接下来你要去检测在给定A跟b的前提下,x是否存在(是否能找得到一个x跟A相乘以后得到的结果是b)。或者说:System of Linear Equations所有的coefficient A,跟constant term,是否能够找到一组x,使得equations成立。在这次课中会学到连个专业词汇,一个是linear combination,第二个是span。 15 | 16 | 有没有解这个问题非常的重要,你可以假设linear system是你的电路,现在老板指定这个电路输出数值b的电流,那自己能不能找到合适的电压源和电流源,使得输出是老板指定的输出值。 17 | 18 | ## 解 19 | 20 | ![](res/chapter6-4.png) 21 | 22 | 23 | 如果一个System of Linear Equations有解,不管是有一个解或者更多的解,那我们说这个System of Linear Equations是consistent。如果的解是空集合时,那我们称这个System of Linear Equations叫做inconsistent。 24 | 25 | $\begin{matrix} 26 | 3x_1+x_2=10\\ 27 | x_1-3x_2=0 28 | \end{matrix}$只有唯一解为$\begin{bmatrix} 29 | 3\\ 30 | 1 31 | \end{bmatrix}$;$\begin{matrix} 32 | 3x_1+x_2=10\\ 33 | 6x_1+2x_2=1 34 | \end{matrix}$的解为$\begin{bmatrix} 35 | 3\\ 36 | 1 37 | \end{bmatrix}+t\begin{bmatrix} 38 | -1\\ 39 | 3 40 | \end{bmatrix}$,其中t为任何实数,解都是成立的。$\begin{matrix} 41 | 3x_1+x_2=10\\ 42 | 6x_1+2x_2=0 43 | \end{matrix}$的解为空集合。所以第一个System of Linear Equations是constant,第一个System of Linear Equations也是constant,第三个System of Linear Equations是inconstant 44 | 45 | 46 | ![](res/chapter6-5.png) 47 | 48 | 49 | 检查System of Linear Equations这件事在高中的时候也学过,你可能解过这样的问题,给定一个二元一次联立方程式$\begin{matrix} 50 | a_{11}x_1+a_{12}x_2=b_1\\ 51 | a_{21}x_1+a_{22}x_2=b_2 52 | \end{matrix}$,第一条Equations是一条直线($x_1$可以当做x轴,$x_2$当做是y轴),第二条Equations也是一条直线,将两条直线画在图中。如果这两条直线只有唯一的交点,那这个System of Linear Equations就是唯一解;如果这两条直线是平行的,那这个System of Linear Equations就是无解;如果这两条直线正好重合,那就有无穷多解。 53 | 54 | 在高中的时候是解有两个变量的方程式,你可以画在二维的平面上,解三个变量可以画在三维的空间中。你可能没有试着解过更高维的问题,在线性代数这门课里面,我们就是要解更高维的问题。 55 | 56 | ## linear combination 57 | 58 | 
![](res/chapter6-6.png) 59 | 60 | 61 | 首先要讲的观念是linear combination。 62 | 63 | 64 | ![](res/chapter6-7.png) 65 | 66 | 给定一组vector set{$u_1,u_2,...u_k$},你可以对这个给定的vector set做linear combination。指定k个scalars,称作coefficient of linear combination,然后将k个vector跟k个数值相乘,得到一个新的vector v($v=c_1u_1+c_2u_2+,...,c_ku_k$)。 67 | 68 | 假定一组vector set为{$\begin{bmatrix} 69 | 1\\ 70 | 1 71 | \end{bmatrix},\begin{bmatrix} 72 | 1\\ 73 | 3 74 | \end{bmatrix},\begin{bmatrix} 75 | 1\\ 76 | -1 77 | \end{bmatrix}$},给定coefficient为{$-3,4,1$}。coefficient跟vector set做linear combination的结果为$\begin{bmatrix} 78 | 2\\ 79 | 8 80 | \end{bmatrix}$。 81 | 82 | ### 列的角度 83 | 84 | ![](res/chapter6-8.png) 85 | 86 | 87 | linear combination跟有没有解有什么关系呢,事实上之前讲matrix跟vector相乘时,已经讲过linear combination。其实矩阵跟向量相乘,就是将矩阵的每一个column做linear combination。 88 | 89 | 将所有的coefficient集合在一起当做matrix来看待(对应相同variable的coefficient集合就是每一个column),将矩阵A跟向量x相乘,得到的结果其实就是将($a_1,a_2,...a_n$)和($x_1,x_2,...,x_n$)做linear combination。matrix跟vector相乘就是将matrix的column做linear combination。 90 | 91 | 92 | 93 | ![](res/chapter6-9.png) 94 | 95 | 我们今天要讨论的问题是Ax=b有没有解,等同于solution set是否为empty,等同于这个linear combination是否为consistent。也等同于b是否为A的column linear combination 96 | 97 | ### 示例1 98 | 99 | ![](res/chapter6-10.png) 100 | 101 | 给定的linear combination($\begin{matrix} 102 | 3x_1+6x_2=3\\ 103 | 2x_1+4x_2=4 104 | \end{matrix}$)可以写做为矩阵($A=\begin{bmatrix} 105 | 3 &6 \\ 106 | 2 &4 107 | \end{bmatrix}$)跟向量($\begin{bmatrix} 108 | x_1\\ 109 | x_2 110 | \end{bmatrix}$)相乘,得到的结果为($\begin{bmatrix} 111 | 3\\ 112 | 4 113 | \end{bmatrix}$)。 114 | 115 | 接下来的问题是b是否为A的column($\begin{bmatrix} 116 | 3\\ 117 | 2 118 | \end{bmatrix},\begin{bmatrix} 119 | 6\\ 120 | 4 121 | \end{bmatrix}$),b是否为这两个column的linear combination。 122 | 123 | 124 | ![](res/chapter6-11.png) 125 | 126 | 127 | 接下来用图示的方法来解决这个问题,将vector set($\begin{bmatrix} 128 | 3\\ 129 | 2 130 | \end{bmatrix},\begin{bmatrix} 131 | 6\\ 132 | 4 133 | 
\end{bmatrix}$)描述在二维的平面上(未来要解更高维问题的时候,你没有办法将它画在二维的平面上,我们在这里先讨论低维的事情)。如果你用任何的coefficient和($\begin{bmatrix} 134 | 3\\ 135 | 2 136 | \end{bmatrix},\begin{bmatrix} 137 | 6\\ 138 | 4 139 | \end{bmatrix}$)做linear combination,linear combination的结果一定落在这条虚线上。 140 | 141 | 如果$\begin{bmatrix} 142 | 3\\ 143 | 4 144 | \end{bmatrix}$并没有落在这条虚线上,那它就不是($\begin{bmatrix} 145 | 3\\ 146 | 2 147 | \end{bmatrix},\begin{bmatrix} 148 | 6\\ 149 | 4 150 | \end{bmatrix}$)的linear combination,所以这个System of Linear Equations是没有解。 151 | 152 | ### 示例2 153 | 154 | ![](res/chapter6-12.png) 155 | 156 | 这个System of Linear Equations($\begin{matrix} 157 | 2x_1+3x_2=4\\ 158 | 3x_1+1x_2=-1 159 | \end{matrix}$)是否有解,有解这个陈述等同于columns($\begin{bmatrix} 160 | 2\\ 161 | 3 162 | \end{bmatrix},\begin{bmatrix} 163 | 3\\ 164 | 1 165 | \end{bmatrix}$)做linear combination能不能产生($\begin{bmatrix} 166 | 4\\ 167 | -1 168 | \end{bmatrix}$)。 169 | 170 | ![](res/chapter6-13.png) 171 | 172 | 173 | 将在图中描述($\begin{bmatrix} 174 | 2\\ 175 | 3 176 | \end{bmatrix},\begin{bmatrix} 177 | 3\\ 178 | 1 179 | \end{bmatrix}$),vector$\begin{bmatrix} 180 | 4\\ 181 | -1 182 | \end{bmatrix}$是否可以用($\begin{bmatrix} 183 | 2\\ 184 | 3 185 | \end{bmatrix},\begin{bmatrix} 186 | 3\\ 187 | 1 188 | \end{bmatrix}$)进行linear combination得到呢?这个是可以得到。将$\begin{bmatrix} 189 | 3\\ 190 | 1 191 | \end{bmatrix}$乘以2,$\begin{bmatrix} 192 | 2\\ 193 | 3 194 | \end{bmatrix}$乘以(-1),然后将其进行相加得到结果为$\begin{bmatrix} 195 | 4\\ 196 | -1 197 | \end{bmatrix}$。所以b是A的linear combination,这个System of Linear Equations是有解的。 198 | 199 | 200 | ![](res/chapter6-14.png) 201 | 202 | 203 | 204 | 如果vector u和v为互不平行的二维向量,那它们的linear combination就是整个二维空间。这就告诉我们:假设给定System of Linear Equations,matrix A中的两个vector(u,v)不是平行的,那这个System of Linear Equations不管右边的b值是什么,一定是有解的。 205 | 206 | 现在很清楚的知道在二维空间中两个非平行的vector进行linear 表示任何的vector,那我们推广至三维的空间。在三维空间中有u、v、w互不平行的三条向量,u、v、w三条互不平行的向量进行linear combination之后是不能表示三维空间中的任何一个vector。所以是不是平行来决定一组vector是否能表示整个空间的向量是不够的,所以会在下面的投影片提一个independent的概念。 207 | 208 | 
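Whether b is a linear combination of A's columns can also be tested without drawing pictures: Ax = b is consistent exactly when appending b as an extra column does not raise the rank. A sketch over the examples above — the `rank` and `consistent` helpers are my own illustration, using exact fractions:

```python
from fractions import Fraction

def rank(rows):
    """Rank via exact Gaussian elimination."""
    m = [[Fraction(x) for x in r] for r in rows]
    lead = 0
    for col in range(len(m[0]) if m else 0):
        piv = next((i for i in range(lead, len(m)) if m[i][col] != 0), None)
        if piv is None:
            continue
        m[lead], m[piv] = m[piv], m[lead]
        for i in range(lead + 1, len(m)):
            f = m[i][col] / m[lead][col]
            m[i] = [a - f * b for a, b in zip(m[i], m[lead])]
        lead += 1
    return lead

def consistent(A, b):
    # Ax = b solvable iff rank(A) == rank of the augmented matrix [A | b]
    return rank(A) == rank([row + [bi] for row, bi in zip(A, b)])

print(consistent([[3, 6], [2, 4]], [3, 4]))    # False: b is off the line the columns span
print(consistent([[2, 3], [3, 1]], [4, -1]))   # True: b = (-1)*[2,3] + 2*[3,1]
print(consistent([[2, 6], [1, 3]], [-4, -2]))  # True: parallel columns, but b lies on their line
```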
我门刚才学到在二维平面中u,v非平行,就一定有解;反过来看:有解是否能保证u,v非平行,这是不成立的。 209 | 210 | ### 示例三 211 | 212 | ![](res/chapter6-15.png) 213 | 214 | 接下来我们举个例子来描述它是为什么不成立的,matrix A为$\begin{bmatrix} 215 | 2 &6 \\ 216 | 1 &3 217 | \end{bmatrix}$,b$\begin{bmatrix} 218 | -4\\ 219 | -2 220 | \end{bmatrix}$。然后我们问它有没有解,意思就是说:matrix A的columns做linear combination后有没有可能得到b 221 | 222 | ![](res/chapter6-16.png) 223 | 224 | 那我们就把matrix A的cloumns在二维的空间中做出来($\begin{bmatrix} 225 | 2\\ 226 | 1 227 | \end{bmatrix},\begin{bmatrix} 228 | 6\\ 229 | 3 230 | \end{bmatrix}$做linear combination以后有没有可能得到$\begin{bmatrix} 231 | -4\\ 232 | -2 233 | \end{bmatrix}$),$\begin{bmatrix} 234 | 2\\ 235 | 1 236 | \end{bmatrix},\begin{bmatrix} 237 | 6\\ 238 | 3 239 | \end{bmatrix}$做linear combination以后所得到的vector一定落在跟他们平行的这条虚线上,$\begin{bmatrix} 240 | -4\\ 241 | -2 242 | \end{bmatrix}$正好落在这条虚线上,所以它是有解的。如果$\begin{bmatrix} 243 | -4\\ 244 | -2 245 | \end{bmatrix}$没有落在这条虚线上,它就是没有解的。 246 | 247 | 如果u和v是非平行的(column的两个vector)一定有解,一定有解不能证明它是非平行的。 248 | 249 | ## Span 250 | 251 | ![](res/chapter6-17.png) 252 | 253 | 254 | 接下来要讲的一个观念是:Span 255 | 256 | ![](res/chapter6-18.png) 257 | 258 | 有一个vector Set S=($u_1,u_2,...u_k$),将$u_1,u_2,...u_k$做linear combination后,将linear combination可能得到的vector集合起来叫做这个vector Span(穷举所有的coefficients就可以组成各式各样的vector,你将所有组出的vector都集合起来得到一个很大的集合,这个很大的集合写做Span S) 259 | 260 | 261 | 加入有一个vector Set V正好等于Span S,我们就称vector set S是vector Set V的generating set,或者我们说:S generates V,我们可以将S当做是描述V的一个特性的方法。 262 | 263 | 264 | V是很大的vector,通常有无穷多个element,所以如果要描述它时,没有办法将里面的element都列举出来。所以如果要描述有无穷多个element的vector set时,你就可以用generates set来描述它具有咋样的特性。 265 | 266 | 267 | 268 | ![](res/chapter6-19.png) 269 | 270 | 现在有一个vector $S_0$($\begin{bmatrix} 271 | 0\\ 272 | 0 273 | \end{bmatrix}$),将这个vector$S_0$做Span后,得到新的vector也只有一个member(0乘以任何的coefficients$c_1$是0),所以进行Span之后并不是永远得到无穷多个element 274 | 275 | 276 | 现在有vector$S_1$($\begin{bmatrix} 277 | 1\\ 278 | -1 279 | 
\end{bmatrix}$),$S_1$进行Span以后,得到的结果是这一条黑色直线上任何的vector,这条黑色上的任何vector都可以用vector($\begin{bmatrix} 280 | 1\\ 281 | -1 282 | \end{bmatrix}$)乘以coefficient$c_1$得到。所以将$S_1$进行Span以后,得到新的vector是在这条直线上面所有vector的集合。 283 | 284 | 如果在一个vector set里面有非零的vector,若将这个vector set进行Span以后会有无穷多的vector 285 | 286 | 287 | 288 | ![](res/chapter6-20.png) 289 | 290 | 291 | 292 | 更多的示例:将vecctor($\begin{bmatrix} 293 | 1\\ 294 | -1 295 | \end{bmatrix}$)进行Span以后,得到的vector set是这条黑色的直线。现在将vector($\begin{bmatrix} 296 | 1\\ 297 | -1 298 | \end{bmatrix},\begin{bmatrix} 299 | -2\\ 300 | 2 301 | \end{bmatrix}$)进行Span以后($\begin{bmatrix} 302 | 1\\ 303 | -1 304 | \end{bmatrix}$乘以各种不同的$c_1$,$\begin{bmatrix} 305 | 2\\ 306 | -2 307 | \end{bmatrix}$乘以各种不同的$c_2$),得到的结果也会落在同一条黑色的直线上。 308 | 309 | 这个示例告诉我们Span $S_1$=Span $S_2$,所以就算是两个vector set里面的vector数目不同,两者都进行Span以后仍然有可能得到相同的vector set。 310 | 311 | 312 | ![](res/chapter6-21.png) 313 | 314 | 315 | 将vector set$S_3$,现在将$S_3$进行Span之后得到的Span $S_3$是整个二维空间中所有向量的集合($Span S_3=R^2$) 316 | 317 | 318 | ![](res/chapter6-22.png) 319 | 320 | 若vector set多加一个vector·=$\begin{bmatrix} 321 | -1\\ 322 | 3 323 | \end{bmatrix}$,Span$S_4=R^2$。接下来不管你多加多少个vector,得到的结果仍然都是 $R^2$ 324 | 325 | 326 | 327 | ![](res/chapter6-23.png) 328 | 329 | standard vector是其中一维是1,都是都为0。假设对e_2$进行Span之后,得到的结果是所有落在z轴上的vector;若将$e_1,e_2$进行Span之后,得到的结果是x,y整个平面;若将$e_1,e_2,e_3$进行Span之后,得到的结果是整个三维的空间。 330 | 331 | 332 | 333 | ![](res/chapter6-24.png) 334 | 335 | 336 | 有了Span的概念后,我们刚才的陈述:有没有解这个问题就可以换一种陈述。有没有解等于:b是不是A的linear combination,也等于b是不是在A的cloumns 的Span里面(A的column所成的集合进行Span之后得到新的vector set V,b如果在V里,就代表这个linear equation是有解的;若没有就代表无解) 337 | 338 | ## 总结 339 | 340 | ![](res/chapter6-25.png) 341 | 342 | 检验一个linear equations有没有解等同于b是不是A的column的linear combination;等同于b是不是在A的column进行Span之后所形成的vector set里面 343 | 344 | -------------------------------------------------------------------------------- /docs/chapter8/chapter8.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | 
![](res/chapter8-1.png) 4 | 5 | 前面几节课的内容我们讲述了linear equation有没有解、有多少解,只是做了专业名词的定义,没有说明有什么演算法可以直接操作就知道有解还是没有解。今天要讲的内容是:按照高斯消去法进行操作就可以知道有解还是无解。 6 | 7 | ## 前人计算线性方程组书籍 8 | ### 九章算术 9 | 10 | ![](res/chapter8-2.png) 11 | 12 | 最早出现高斯消去法是在《九章算术》(汉朝)的第八章的方程章,此章投影片展示的是其中的习题。 13 | 14 | ### 算法统宗 15 | 16 | ![](res/chapter8-3.png) 17 | 18 | 在明代有另外一本书叫做《算法统宗》,通过编歌来让我们记忆。 19 | 20 | ## 等价的线性方程组 21 | 22 | ![](res/chapter8-4.png) 23 | 24 | 25 | 我们今天要学的内容是如何来解linear equation,我们要先知道一个概念是equivalent,如果两个linear equation有同样的解,那我们就说这两个linear equation是等价的。 26 | 27 | 28 | 假设有两个linear equation`$\left\{\begin{matrix} 29 | 3x_1+x_2=10\\ 30 | x_1-3x_2=0 31 | \end{matrix}\right.$`,它的解是`$\begin{bmatrix} 32 | 3\\ 33 | 1 34 | \end{bmatrix}$`,另外的两个linear equation为`$\left\{\begin{matrix} 35 | x_1=3\\ 36 | x_2=1 37 | \end{matrix}\right.$`,它的也解为`$\begin{bmatrix} 38 | 3\\ 39 | 1 40 | \end{bmatrix}$`。这个linear equation虽然不相同,但是是等价的。 41 | 42 | 43 | ![](res/chapter8-5.png) 44 | 45 | 46 | 我们若将一些linear equation做某些operations以后变为另外一个等价的linear equation,有下面的三种操作:第一个操作(Interchange)是将equation进行交换(`$\left\{\begin{matrix} 47 | 3x_1+x_2=10\\ 48 | x_1-3x_2=0 49 | \end{matrix}\right.\Rightarrow \left\{\begin{matrix} 50 | x_1-3x_2=0\\ 51 | 3x_1+x_2=10 52 | \end{matrix}\right.$`);第二个操作(Scaling)是将其中的某一个equation乘以任何的scalar(将每一个coefficients变为原来的n倍)(`$\left\{\begin{matrix} 53 | 3x_1+x_2=10\\ 54 | x_1-3x_2=0 55 | \end{matrix}\right.\Rightarrow \left\{\begin{matrix} 56 | 3x_1+x_2=10\\ 57 | -3x_1+9x_2=0 58 | \end{matrix}\right.$`),不会影响最后的solution;第三个操作(Row Addition)是将某一个equation乘以n倍再加给另外一个equation(`$\left\{\begin{matrix} 59 | 3x_1+x_2=10\\ 60 | x_1-3x_2=0 61 | \end{matrix}\right.\Rightarrow \left\{\begin{matrix} 62 | 10x_2=10\\ 63 | x_1-3x_2=0 64 | \end{matrix}\right.$`),不会影响最后的solution。 65 | 66 | 67 | ![](res/chapter8-6.png) 68 | 69 | 今天我们要解一个linear equation的策略就是:对给定的linear equation不断进行上一张投影片所讲的内容,直到它变为特别简单的linear equation。 70 | 71 | 72 | 假设有linear equations为`$\left\{\begin{matrix} 73 | x_1-3x_2=0\\ 74 | 3x_1+x_2=10 75 | 
\end{matrix}\right.$`,使用Row Addition变为`$\left\{\begin{matrix} 76 | x_1-3x_2=0\\ 77 | 10x_2=10 78 | \end{matrix}\right.$`,变为`$\left\{\begin{matrix} 79 | x_1-3x_2=0\\ 80 | x_2=1 81 | \end{matrix}\right.$`,变为`$\left\{\begin{matrix} 82 | x_1=3\\ 83 | x_2=1 84 | \end{matrix}\right.$` 85 | 86 | 高斯消去法的策略就是不断进行不会影响solution的operations,直到linear equation变为一秒就可以看出答案为止。 87 | 88 | ## 增广矩阵 89 | 90 | ![](res/chapter8-7.png) 91 | 92 | 93 | 我们刚才举的例子是二元的联立方程式,我们通常解的是多元的联立方程式。我们可以将linear equation的coefficients可以用matrix来进行表示,未知数可以用向量**x**来表示,等号右边的constant也可以用向量**b**来进行表示。我们可以写作为`$Ax=b$` 94 | 95 | 96 | 97 | ![](res/chapter8-8.png) 98 | 99 | 另外的一个表示方法是augmented matrix(不是用矩阵和向量相乘来进行表示),将所有的coefficients和constant摆在一起,就是augmented matrix,如图所示的augmented matrix为`$m*(n+1)$`的矩阵。 100 | 101 | 102 | ![](res/chapter8-9.png) 103 | 104 | 105 | 我们实际对linear equation进行变换的时候不会将等号和x写出来,因为这是很麻烦的(`$\left\{\begin{matrix} 106 | x_1-3x_2=0\\ 107 | 3x_1+x_2=10 108 | \end{matrix}\right.$`),我们实际上是augmented matrix上进行操作,每一个linear equation都对应一个augmented matrix(`$\begin{bmatrix} 109 | 1 &-3 &0 \\ 110 | 3 & 1 & 10 111 | \end{bmatrix}$`)。 112 | 113 | ## RREF 114 | 115 | ![](res/chapter8-10.png) 116 | 117 | 若有一个很复杂linear equation(`$Ax=b$`),它的augmented matrix `$A'$`为,接下来做以下三个操作,就可以将原来的`$A'$`变为`$A''$`,依次类推。直到它变为一个非常简单的形式(`$R=[R'b']$`),然后反推回linear equation(`$A'x=b'$`) 118 | 119 | 120 | augmented matrix经历的三个操作(Interchange,Scaling,Row Addition)称为elementary row operations,上述描述的最简单的matrix(`$R=[R'b']$`)可以让我们一眼看出linear equation对应的solution是什么,这个matrix称为reduced row echelon form(RREF) 121 | 122 | 123 | ![](res/chapter8-11.png) 124 | 125 | 126 | 我们在讲reduced row echelon form,先搞明白row echelon form是什么。一个matrix我们说它是row echelon form时,它会满足以下两个性质:第一个性质是,所有的nonzero的row都在zero的row上面(若matrix里面有全为0的row,一定要摆在matrix的最底下)。第二个性质是,将所有的row的leading entries取出来(leading entries表示为:从左向右看,第一个不为0的element是leading entries),发现leading entries排成echelon form(如果leading entries排成像阶梯的形状,那么它就是echelon form,换句话说就是每一个leading entries都是在前一个leading entries的右下角) 127 | 128 | 129 
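The two defining properties of row echelon form just listed — all zero rows at the bottom, and each leading entry strictly to the right of the one in the row above — translate directly into a check. A minimal sketch (the helper names are mine):

```python
def leading_index(row):
    """Column index of the first nonzero entry, or None for a zero row."""
    return next((j for j, v in enumerate(row) if v != 0), None)

def is_row_echelon(M):
    leads = [leading_index(row) for row in M]
    # property 1: no nonzero row may sit below a zero row
    if any(leads[i] is None and leads[i + 1] is not None
           for i in range(len(leads) - 1)):
        return False
    # property 2: leading entries move strictly rightward going down
    nz = [l for l in leads if l is not None]
    return all(a < b for a, b in zip(nz, nz[1:]))

# a matrix that fails property 2: the third row's leading entry (column 1)
# is to the LEFT of the second row's leading entry (column 2)
M = [[1, 0, 0, 6, 3, 0],
     [0, 0, 1, 5, 7, 0],
     [0, 1, 0, 2, 4, 0],
     [0, 0, 0, 0, 0, 1]]
print(is_row_echelon(M))  # False
```

Swapping the second and third rows of M would make the leading entries step rightward and the check pass.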
| ![](res/chapter8-12.png) 130 | 131 | 132 | 假设有一个matrix`$\begin{bmatrix} 133 | 1&0 &0 &6 & 3 &0 \\ 134 | 0& 0 & 1 & 5 &7 &0 \\ 135 | 0& 1& 0 & 2 & 4 &0 \\ 136 | 0& 0& 0& 0 & 0 & 1 137 | \end{bmatrix}$`,通过观察这个matrix满足第一个性质(没有nonzero row)。接下来我们将这个matrix里的leading entries圈出来,发现在这些leading entries里第三个row 的leading entries不是在第二个leading entries的右下方,这个matrix不满足第二条性质,所以这个matrix不是row echelon form。 138 | 139 | 140 | ![](res/chapter8-13.png) 141 | 142 | 143 | 接下来要讲的内容是reduce row echelon form,reduce row echelon form满足的第一个和第二个条件为echelon form,第三个条件为leading entries所在的那个columns必须为standard vectors(standard vector表示其中只有一维是1,其他都为0(第一个column是standard vector;第二个leading entries在第三个column,但是第三个column不是standard vector;第三个leading entries在第四个column,但是第四个column不是standard vector)),所以这个matrix不是reduced rowechelon form。 144 | 145 | 146 | ![](res/chapter8-14.png) 147 | 148 | 假设现在有一个matrix`$\begin{bmatrix} 149 | 1&-3 &0 &2 & 0 &7 \\ 150 | 0& 0 & 1 & 6 &0 &9 \\ 151 | 0& 0& 0 & 0 & 1 &2 \\ 152 | 0& 0& 0& 0 & 0 & 1 153 | \end{bmatrix}$`,若现在判断这个matrix是不是为reduced row echelon from,先用判断这个matrix是不是row echelon form。row echelon from的一个条件为,zero的row是不是在最下面,显然这个matrix是满足的;第二个条件为,每一个leading entries都是在前一个leading entries的右下角,显然这个matrix也是满足的。我们通过观察这个matrix的leading entries在standard vector里面,所以这个matrix是reduced row echelon from。 154 | 155 | ### 性质 156 | 157 | ![](res/chapter8-15.png) 158 | 159 | reduced row echelon from有一个重要的性质:每一个matrix对应一个RREF(RREF is unique)。将一个matrix进行row operation变为reduced row echelon from,不管row operation的操作顺序是如何,不管用了哪些row operation的操作,最终一个matrix的reduced row echelon from是相同的。但是row echelon form可能会有很多的不相同。 160 | 161 | 假设现有一个matrix`$\begin{bmatrix} 162 | 1 &-2 &-1 &3 \\ 163 | 3& -6 & -5 &3 \\ 164 | 2 & -1 & 1 & 0 165 | \end{bmatrix}$`,这个matrix通过row operation变为row echelon form(`$\begin{bmatrix} 166 | 1 &-2 &-1 &3 \\ 167 | 0& 3 & 3 &-6 \\ 168 | 0 & 0 & 1 & 3 169 | \end{bmatrix}$`),但是row echelon form不只是有一个,我们可以再做一些row operation变为`$\begin{bmatrix} 170 | 1 &-2 &-1 &3 \\ 171 | 0& 3 & 3 &-6 \\ 172 | 0 & 0 & 
3 & 9 173 | \end{bmatrix}$`(将第三个row乘以3);我们可以再做一些row operation变为`$\begin{bmatrix} 174 | 1 &-2 &-1 &3 \\ 175 | 0& 3 & 0 &-15 \\ 176 | 0 & 0 & 3 & 9 177 | \end{bmatrix}$`(将第三个row乘以-1,加到第二个row),所以一个matrix有很多的row echelon form。但是reduced row echelon 只有唯一的。 178 | 179 | 180 | ![](res/chapter8-16.png) 181 | 182 | 183 | 184 | 现在假设matrix A为`$\begin{bmatrix} 185 | 1& 2 &-1 & 2 &1 &2 \\ 186 | -1 &-2 & 1& 2 &3 &6 \\ 187 | 2 & 4 & -3 & 2 & 0&3 \\ 188 | -3 & -6 & 2 & 0 & 3 &9 189 | \end{bmatrix}$`变为reduced row echelon form(`$\begin{bmatrix} 190 | 1& 2 &0 & 0 &-1 &-5 \\ 191 | 0 &0 & 1& 0 &0 &-3 \\ 192 | 0 & 0 & 0 & 1 & 1&2 \\ 193 | 0 & 0 & 0 & 0 & 0 &0 194 | \end{bmatrix}$`)。如果我们找出reduced row echelon form里面的leading entry的位置((1, 1), (2, 3), (3, 4)),将这些位置对应到原来matrix的位置上。这些位置被称为pivot positions,pivot positions所在的column称为pivot columns。 195 | 196 | ### RREF的解 197 | 198 | ![](res/chapter8-17.png) 199 | 200 | 201 | 接下来要讲的内容是给定一个reduced row echelon form,咋样看出它的solution。 202 | 203 | Unique Solution:给定一个reduced row echelon form`$\begin{bmatrix} 204 | 1 &0 &0 &-4 \\ 205 | 0 & 1 &0 &-5 \\ 206 | 1 & 0 &1 & 3 207 | \end{bmatrix}$`,将它转回linear equation,然后它的解很容易就可以看出`$\left\{\begin{matrix} 208 | x_1=-4\\ 209 | x_2=-5\\ 210 | x_3=3 211 | \end{matrix}\right.$`。从这个示例中得到的结论为:如果reduced row echelon form去掉最后一个column以后,就变为一个identity matrix(斜对角的element为1,其它为0)表示就是有唯一解。 212 | 213 | 214 | 215 | ![](res/chapter8-18.png) 216 | 217 | 给定一个reduced row echelon form`$\begin{bmatrix} 218 | 1 & -1 &0 &2 &0 &7 \\ 219 | 0 &0 & 1 & 6 & 0 &9 \\ 220 | 0 &0 & 0 & 0 & 1 &2 \\ 221 | 0 &0 & 0 & 0 & 0 &0 222 | \end{bmatrix}$`,将它还原为linear equation 223 | 变为`$\begin{bmatrix} 224 | x_1-3x_2+2x_4=7\\ 225 | x_3+6x_4=9\\ 226 | x_5=2\\ 227 | 0=0 228 | \end{bmatrix}$`。除以`$x_5=2$`无法确定其它的variable数值等于多少,经变化为`$x_3=9-6x_4,x_1=7+3x_2-2x_4$`。我们将`$x_2, x_4$`称为Free variables,称`$x_1, x_3, x_5$`为Basic variables。根据上述式子`$\left\{\begin{matrix} 229 | x_1=7+3x_2-2x_4\\ 230 | x_2 ,free\\ 231 | x_3=9-6x_4\\ 232 | x_4, free\\ 233 | x_5=2 234 | 
\end{matrix}\right.$`可以写出parametric Representation。如果我们在linear equation 有Free variables时,就代表有无穷多解(不同的`$x_2, x_4$`就有不同的`$x_1, x_3$`)。将上述的式子描述为parametric Representation为`$\begin{bmatrix} 235 | x_1\\ 236 | x_2\\ 237 | x_3\\ 238 | x_4\\ 239 | x_5 240 | \end{bmatrix}=\begin{bmatrix} 241 | 7+3x_2-2x_4\\ 242 | x_2\\ 243 | 9-6x_4\\ 244 | x_4\\ 245 | 2 246 | \end{bmatrix}=\begin{bmatrix} 247 | 7\\ 248 | 0\\ 249 | 9\\ 250 | 0\\ 251 | 2 252 | \end{bmatrix}+x_2\begin{bmatrix} 253 | 3\\ 254 | 1\\ 255 | 0\\ 256 | 0\\ 257 | 0 258 | \end{bmatrix}+x_4\begin{bmatrix} 259 | -2\\ 260 | 0\\ 261 | -6\\ 262 | 1\\ 263 | 0 264 | \end{bmatrix}$` 265 | 266 | 267 | 268 | ![](res/chapter8-19.png) 269 | 270 | 271 | 假设有一个reduced row echelon form为`$\begin{bmatrix} 272 | 1 &0 &-3 &0 \\ 273 | 0 &1 & 2 &0 \\ 274 | 0 &0 & 0 &1 \\ 275 | 0 &0 &0 &0 276 | \end{bmatrix}$`,然后将它还原为linear equation`$\left\{\begin{matrix} 277 | x_1-3x_3=0\\ 278 | x_2+2x_3=0\\ 279 | 0x_1+0x_2+0x_3=1\\ 280 | 0x_1+0x_2+0x_3=0 281 | \end{matrix}\right.$`,这时候发现第三个row为`$0x_1+0x_2+0x_3=1$`,这显然是不成立,所以这个linear equation 是无解的。 282 | 283 | 若发现某一个row只有最后一维非0,那这个linear equation就是inconsistent 284 | 285 | ## 高斯消去原则 286 | 287 | ![](res/chapter8-20.png) 288 | 289 | 290 | 整个高斯消去法的大原则为:将原来的augmented matrix做一连串的elementary row operations变成row echelon form,在做一连串的elementary row operations变成reduced row echelon form 291 | 292 | 293 | 294 | 295 | 296 | 297 | 298 | 299 | 300 | 301 | 302 | 303 | 304 | 305 | 306 | 307 | 308 | 309 | 310 | 311 | 312 | 313 | 314 | 315 | 316 | 317 | 318 | 319 | 320 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | GNU GENERAL PUBLIC LICENSE 2 | Version 3, 29 June 2007 3 | 4 | Copyright (C) 2007 Free Software Foundation, Inc. 5 | Everyone is permitted to copy and distribute verbatim copies 6 | of this license document, but changing it is not allowed. 
7 | 8 | Preamble 9 | 10 | The GNU General Public License is a free, copyleft license for 11 | software and other kinds of works. 12 | 13 | The licenses for most software and other practical works are designed 14 | to take away your freedom to share and change the works. By contrast, 15 | the GNU General Public License is intended to guarantee your freedom to 16 | share and change all versions of a program--to make sure it remains free 17 | software for all its users. We, the Free Software Foundation, use the 18 | GNU General Public License for most of our software; it applies also to 19 | any other work released this way by its authors. You can apply it to 20 | your programs, too. 21 | 22 | When we speak of free software, we are referring to freedom, not 23 | price. Our General Public Licenses are designed to make sure that you 24 | have the freedom to distribute copies of free software (and charge for 25 | them if you wish), that you receive source code or can get it if you 26 | want it, that you can change the software or use pieces of it in new 27 | free programs, and that you know you can do these things. 28 | 29 | To protect your rights, we need to prevent others from denying you 30 | these rights or asking you to surrender the rights. Therefore, you have 31 | certain responsibilities if you distribute copies of the software, or if 32 | you modify it: responsibilities to respect the freedom of others. 33 | 34 | For example, if you distribute copies of such a program, whether 35 | gratis or for a fee, you must pass on to the recipients the same 36 | freedoms that you received. You must make sure that they, too, receive 37 | or can get the source code. And you must show them these terms so they 38 | know their rights. 39 | 40 | Developers that use the GNU GPL protect your rights with two steps: 41 | (1) assert copyright on the software, and (2) offer you this License 42 | giving you legal permission to copy, distribute and/or modify it. 
43 | 44 | For the developers' and authors' protection, the GPL clearly explains 45 | that there is no warranty for this free software. For both users' and 46 | authors' sake, the GPL requires that modified versions be marked as 47 | changed, so that their problems will not be attributed erroneously to 48 | authors of previous versions. 49 | 50 | Some devices are designed to deny users access to install or run 51 | modified versions of the software inside them, although the manufacturer 52 | can do so. This is fundamentally incompatible with the aim of 53 | protecting users' freedom to change the software. The systematic 54 | pattern of such abuse occurs in the area of products for individuals to 55 | use, which is precisely where it is most unacceptable. Therefore, we 56 | have designed this version of the GPL to prohibit the practice for those 57 | products. If such problems arise substantially in other domains, we 58 | stand ready to extend this provision to those domains in future versions 59 | of the GPL, as needed to protect the freedom of users. 60 | 61 | Finally, every program is threatened constantly by software patents. 62 | States should not allow patents to restrict development and use of 63 | software on general-purpose computers, but in those that do, we wish to 64 | avoid the special danger that patents applied to a free program could 65 | make it effectively proprietary. To prevent this, the GPL assures that 66 | patents cannot be used to render the program non-free. 67 | 68 | The precise terms and conditions for copying, distribution and 69 | modification follow. 70 | 71 | TERMS AND CONDITIONS 72 | 73 | 0. Definitions. 74 | 75 | "This License" refers to version 3 of the GNU General Public License. 76 | 77 | "Copyright" also means copyright-like laws that apply to other kinds of 78 | works, such as semiconductor masks. 79 | 80 | "The Program" refers to any copyrightable work licensed under this 81 | License. Each licensee is addressed as "you". 
"Licensees" and 82 | "recipients" may be individuals or organizations. 83 | 84 | To "modify" a work means to copy from or adapt all or part of the work 85 | in a fashion requiring copyright permission, other than the making of an 86 | exact copy. The resulting work is called a "modified version" of the 87 | earlier work or a work "based on" the earlier work. 88 | 89 | A "covered work" means either the unmodified Program or a work based 90 | on the Program. 91 | 92 | To "propagate" a work means to do anything with it that, without 93 | permission, would make you directly or secondarily liable for 94 | infringement under applicable copyright law, except executing it on a 95 | computer or modifying a private copy. Propagation includes copying, 96 | distribution (with or without modification), making available to the 97 | public, and in some countries other activities as well. 98 | 99 | To "convey" a work means any kind of propagation that enables other 100 | parties to make or receive copies. Mere interaction with a user through 101 | a computer network, with no transfer of a copy, is not conveying. 102 | 103 | An interactive user interface displays "Appropriate Legal Notices" 104 | to the extent that it includes a convenient and prominently visible 105 | feature that (1) displays an appropriate copyright notice, and (2) 106 | tells the user that there is no warranty for the work (except to the 107 | extent that warranties are provided), that licensees may convey the 108 | work under this License, and how to view a copy of this License. If 109 | the interface presents a list of user commands or options, such as a 110 | menu, a prominent item in the list meets this criterion. 111 | 112 | 1. Source Code. 113 | 114 | The "source code" for a work means the preferred form of the work 115 | for making modifications to it. "Object code" means any non-source 116 | form of a work. 
117 | 118 | A "Standard Interface" means an interface that either is an official 119 | standard defined by a recognized standards body, or, in the case of 120 | interfaces specified for a particular programming language, one that 121 | is widely used among developers working in that language. 122 | 123 | The "System Libraries" of an executable work include anything, other 124 | than the work as a whole, that (a) is included in the normal form of 125 | packaging a Major Component, but which is not part of that Major 126 | Component, and (b) serves only to enable use of the work with that 127 | Major Component, or to implement a Standard Interface for which an 128 | implementation is available to the public in source code form. A 129 | "Major Component", in this context, means a major essential component 130 | (kernel, window system, and so on) of the specific operating system 131 | (if any) on which the executable work runs, or a compiler used to 132 | produce the work, or an object code interpreter used to run it. 133 | 134 | The "Corresponding Source" for a work in object code form means all 135 | the source code needed to generate, install, and (for an executable 136 | work) run the object code and to modify the work, including scripts to 137 | control those activities. However, it does not include the work's 138 | System Libraries, or general-purpose tools or generally available free 139 | programs which are used unmodified in performing those activities but 140 | which are not part of the work. For example, Corresponding Source 141 | includes interface definition files associated with source files for 142 | the work, and the source code for shared libraries and dynamically 143 | linked subprograms that the work is specifically designed to require, 144 | such as by intimate data communication or control flow between those 145 | subprograms and other parts of the work. 
146 | 147 | The Corresponding Source need not include anything that users 148 | can regenerate automatically from other parts of the Corresponding 149 | Source. 150 | 151 | The Corresponding Source for a work in source code form is that 152 | same work. 153 | 154 | 2. Basic Permissions. 155 | 156 | All rights granted under this License are granted for the term of 157 | copyright on the Program, and are irrevocable provided the stated 158 | conditions are met. This License explicitly affirms your unlimited 159 | permission to run the unmodified Program. The output from running a 160 | covered work is covered by this License only if the output, given its 161 | content, constitutes a covered work. This License acknowledges your 162 | rights of fair use or other equivalent, as provided by copyright law. 163 | 164 | You may make, run and propagate covered works that you do not 165 | convey, without conditions so long as your license otherwise remains 166 | in force. You may convey covered works to others for the sole purpose 167 | of having them make modifications exclusively for you, or provide you 168 | with facilities for running those works, provided that you comply with 169 | the terms of this License in conveying all material for which you do 170 | not control copyright. Those thus making or running the covered works 171 | for you must do so exclusively on your behalf, under your direction 172 | and control, on terms that prohibit them from making any copies of 173 | your copyrighted material outside their relationship with you. 174 | 175 | Conveying under any other circumstances is permitted solely under 176 | the conditions stated below. Sublicensing is not allowed; section 10 177 | makes it unnecessary. 178 | 179 | 3. Protecting Users' Legal Rights From Anti-Circumvention Law. 
180 | 181 | No covered work shall be deemed part of an effective technological 182 | measure under any applicable law fulfilling obligations under article 183 | 11 of the WIPO copyright treaty adopted on 20 December 1996, or 184 | similar laws prohibiting or restricting circumvention of such 185 | measures. 186 | 187 | When you convey a covered work, you waive any legal power to forbid 188 | circumvention of technological measures to the extent such circumvention 189 | is effected by exercising rights under this License with respect to 190 | the covered work, and you disclaim any intention to limit operation or 191 | modification of the work as a means of enforcing, against the work's 192 | users, your or third parties' legal rights to forbid circumvention of 193 | technological measures. 194 | 195 | 4. Conveying Verbatim Copies. 196 | 197 | You may convey verbatim copies of the Program's source code as you 198 | receive it, in any medium, provided that you conspicuously and 199 | appropriately publish on each copy an appropriate copyright notice; 200 | keep intact all notices stating that this License and any 201 | non-permissive terms added in accord with section 7 apply to the code; 202 | keep intact all notices of the absence of any warranty; and give all 203 | recipients a copy of this License along with the Program. 204 | 205 | You may charge any price or no price for each copy that you convey, 206 | and you may offer support or warranty protection for a fee. 207 | 208 | 5. Conveying Modified Source Versions. 209 | 210 | You may convey a work based on the Program, or the modifications to 211 | produce it from the Program, in the form of source code under the 212 | terms of section 4, provided that you also meet all of these conditions: 213 | 214 | a) The work must carry prominent notices stating that you modified 215 | it, and giving a relevant date. 
216 | 217 | b) The work must carry prominent notices stating that it is 218 | released under this License and any conditions added under section 219 | 7. This requirement modifies the requirement in section 4 to 220 | "keep intact all notices". 221 | 222 | c) You must license the entire work, as a whole, under this 223 | License to anyone who comes into possession of a copy. This 224 | License will therefore apply, along with any applicable section 7 225 | additional terms, to the whole of the work, and all its parts, 226 | regardless of how they are packaged. This License gives no 227 | permission to license the work in any other way, but it does not 228 | invalidate such permission if you have separately received it. 229 | 230 | d) If the work has interactive user interfaces, each must display 231 | Appropriate Legal Notices; however, if the Program has interactive 232 | interfaces that do not display Appropriate Legal Notices, your 233 | work need not make them do so. 234 | 235 | A compilation of a covered work with other separate and independent 236 | works, which are not by their nature extensions of the covered work, 237 | and which are not combined with it such as to form a larger program, 238 | in or on a volume of a storage or distribution medium, is called an 239 | "aggregate" if the compilation and its resulting copyright are not 240 | used to limit the access or legal rights of the compilation's users 241 | beyond what the individual works permit. Inclusion of a covered work 242 | in an aggregate does not cause this License to apply to the other 243 | parts of the aggregate. 244 | 245 | 6. Conveying Non-Source Forms. 
246 | 247 | You may convey a covered work in object code form under the terms 248 | of sections 4 and 5, provided that you also convey the 249 | machine-readable Corresponding Source under the terms of this License, 250 | in one of these ways: 251 | 252 | a) Convey the object code in, or embodied in, a physical product 253 | (including a physical distribution medium), accompanied by the 254 | Corresponding Source fixed on a durable physical medium 255 | customarily used for software interchange. 256 | 257 | b) Convey the object code in, or embodied in, a physical product 258 | (including a physical distribution medium), accompanied by a 259 | written offer, valid for at least three years and valid for as 260 | long as you offer spare parts or customer support for that product 261 | model, to give anyone who possesses the object code either (1) a 262 | copy of the Corresponding Source for all the software in the 263 | product that is covered by this License, on a durable physical 264 | medium customarily used for software interchange, for a price no 265 | more than your reasonable cost of physically performing this 266 | conveying of source, or (2) access to copy the 267 | Corresponding Source from a network server at no charge. 268 | 269 | c) Convey individual copies of the object code with a copy of the 270 | written offer to provide the Corresponding Source. This 271 | alternative is allowed only occasionally and noncommercially, and 272 | only if you received the object code with such an offer, in accord 273 | with subsection 6b. 274 | 275 | d) Convey the object code by offering access from a designated 276 | place (gratis or for a charge), and offer equivalent access to the 277 | Corresponding Source in the same way through the same place at no 278 | further charge. You need not require recipients to copy the 279 | Corresponding Source along with the object code. 
If the place to 280 | copy the object code is a network server, the Corresponding Source 281 | may be on a different server (operated by you or a third party) 282 | that supports equivalent copying facilities, provided you maintain 283 | clear directions next to the object code saying where to find the 284 | Corresponding Source. Regardless of what server hosts the 285 | Corresponding Source, you remain obligated to ensure that it is 286 | available for as long as needed to satisfy these requirements. 287 | 288 | e) Convey the object code using peer-to-peer transmission, provided 289 | you inform other peers where the object code and Corresponding 290 | Source of the work are being offered to the general public at no 291 | charge under subsection 6d. 292 | 293 | A separable portion of the object code, whose source code is excluded 294 | from the Corresponding Source as a System Library, need not be 295 | included in conveying the object code work. 296 | 297 | A "User Product" is either (1) a "consumer product", which means any 298 | tangible personal property which is normally used for personal, family, 299 | or household purposes, or (2) anything designed or sold for incorporation 300 | into a dwelling. In determining whether a product is a consumer product, 301 | doubtful cases shall be resolved in favor of coverage. For a particular 302 | product received by a particular user, "normally used" refers to a 303 | typical or common use of that class of product, regardless of the status 304 | of the particular user or of the way in which the particular user 305 | actually uses, or expects or is expected to use, the product. A product 306 | is a consumer product regardless of whether the product has substantial 307 | commercial, industrial or non-consumer uses, unless such uses represent 308 | the only significant mode of use of the product. 
309 | 310 | "Installation Information" for a User Product means any methods, 311 | procedures, authorization keys, or other information required to install 312 | and execute modified versions of a covered work in that User Product from 313 | a modified version of its Corresponding Source. The information must 314 | suffice to ensure that the continued functioning of the modified object 315 | code is in no case prevented or interfered with solely because 316 | modification has been made. 317 | 318 | If you convey an object code work under this section in, or with, or 319 | specifically for use in, a User Product, and the conveying occurs as 320 | part of a transaction in which the right of possession and use of the 321 | User Product is transferred to the recipient in perpetuity or for a 322 | fixed term (regardless of how the transaction is characterized), the 323 | Corresponding Source conveyed under this section must be accompanied 324 | by the Installation Information. But this requirement does not apply 325 | if neither you nor any third party retains the ability to install 326 | modified object code on the User Product (for example, the work has 327 | been installed in ROM). 328 | 329 | The requirement to provide Installation Information does not include a 330 | requirement to continue to provide support service, warranty, or updates 331 | for a work that has been modified or installed by the recipient, or for 332 | the User Product in which it has been modified or installed. Access to a 333 | network may be denied when the modification itself materially and 334 | adversely affects the operation of the network or violates the rules and 335 | protocols for communication across the network. 
336 | 337 | Corresponding Source conveyed, and Installation Information provided, 338 | in accord with this section must be in a format that is publicly 339 | documented (and with an implementation available to the public in 340 | source code form), and must require no special password or key for 341 | unpacking, reading or copying. 342 | 343 | 7. Additional Terms. 344 | 345 | "Additional permissions" are terms that supplement the terms of this 346 | License by making exceptions from one or more of its conditions. 347 | Additional permissions that are applicable to the entire Program shall 348 | be treated as though they were included in this License, to the extent 349 | that they are valid under applicable law. If additional permissions 350 | apply only to part of the Program, that part may be used separately 351 | under those permissions, but the entire Program remains governed by 352 | this License without regard to the additional permissions. 353 | 354 | When you convey a copy of a covered work, you may at your option 355 | remove any additional permissions from that copy, or from any part of 356 | it. (Additional permissions may be written to require their own 357 | removal in certain cases when you modify the work.) You may place 358 | additional permissions on material, added by you to a covered work, 359 | for which you have or can give appropriate copyright permission. 
360 | 361 | Notwithstanding any other provision of this License, for material you 362 | add to a covered work, you may (if authorized by the copyright holders of 363 | that material) supplement the terms of this License with terms: 364 | 365 | a) Disclaiming warranty or limiting liability differently from the 366 | terms of sections 15 and 16 of this License; or 367 | 368 | b) Requiring preservation of specified reasonable legal notices or 369 | author attributions in that material or in the Appropriate Legal 370 | Notices displayed by works containing it; or 371 | 372 | c) Prohibiting misrepresentation of the origin of that material, or 373 | requiring that modified versions of such material be marked in 374 | reasonable ways as different from the original version; or 375 | 376 | d) Limiting the use for publicity purposes of names of licensors or 377 | authors of the material; or 378 | 379 | e) Declining to grant rights under trademark law for use of some 380 | trade names, trademarks, or service marks; or 381 | 382 | f) Requiring indemnification of licensors and authors of that 383 | material by anyone who conveys the material (or modified versions of 384 | it) with contractual assumptions of liability to the recipient, for 385 | any liability that these contractual assumptions directly impose on 386 | those licensors and authors. 387 | 388 | All other non-permissive additional terms are considered "further 389 | restrictions" within the meaning of section 10. If the Program as you 390 | received it, or any part of it, contains a notice stating that it is 391 | governed by this License along with a term that is a further 392 | restriction, you may remove that term. 
If a license document contains 393 | a further restriction but permits relicensing or conveying under this 394 | License, you may add to a covered work material governed by the terms 395 | of that license document, provided that the further restriction does 396 | not survive such relicensing or conveying. 397 | 398 | If you add terms to a covered work in accord with this section, you 399 | must place, in the relevant source files, a statement of the 400 | additional terms that apply to those files, or a notice indicating 401 | where to find the applicable terms. 402 | 403 | Additional terms, permissive or non-permissive, may be stated in the 404 | form of a separately written license, or stated as exceptions; 405 | the above requirements apply either way. 406 | 407 | 8. Termination. 408 | 409 | You may not propagate or modify a covered work except as expressly 410 | provided under this License. Any attempt otherwise to propagate or 411 | modify it is void, and will automatically terminate your rights under 412 | this License (including any patent licenses granted under the third 413 | paragraph of section 11). 414 | 415 | However, if you cease all violation of this License, then your 416 | license from a particular copyright holder is reinstated (a) 417 | provisionally, unless and until the copyright holder explicitly and 418 | finally terminates your license, and (b) permanently, if the copyright 419 | holder fails to notify you of the violation by some reasonable means 420 | prior to 60 days after the cessation. 421 | 422 | Moreover, your license from a particular copyright holder is 423 | reinstated permanently if the copyright holder notifies you of the 424 | violation by some reasonable means, this is the first time you have 425 | received notice of violation of this License (for any work) from that 426 | copyright holder, and you cure the violation prior to 30 days after 427 | your receipt of the notice. 
428 | 429 | Termination of your rights under this section does not terminate the 430 | licenses of parties who have received copies or rights from you under 431 | this License. If your rights have been terminated and not permanently 432 | reinstated, you do not qualify to receive new licenses for the same 433 | material under section 10. 434 | 435 | 9. Acceptance Not Required for Having Copies. 436 | 437 | You are not required to accept this License in order to receive or 438 | run a copy of the Program. Ancillary propagation of a covered work 439 | occurring solely as a consequence of using peer-to-peer transmission 440 | to receive a copy likewise does not require acceptance. However, 441 | nothing other than this License grants you permission to propagate or 442 | modify any covered work. These actions infringe copyright if you do 443 | not accept this License. Therefore, by modifying or propagating a 444 | covered work, you indicate your acceptance of this License to do so. 445 | 446 | 10. Automatic Licensing of Downstream Recipients. 447 | 448 | Each time you convey a covered work, the recipient automatically 449 | receives a license from the original licensors, to run, modify and 450 | propagate that work, subject to this License. You are not responsible 451 | for enforcing compliance by third parties with this License. 452 | 453 | An "entity transaction" is a transaction transferring control of an 454 | organization, or substantially all assets of one, or subdividing an 455 | organization, or merging organizations. 
If propagation of a covered 456 | work results from an entity transaction, each party to that 457 | transaction who receives a copy of the work also receives whatever 458 | licenses to the work the party's predecessor in interest had or could 459 | give under the previous paragraph, plus a right to possession of the 460 | Corresponding Source of the work from the predecessor in interest, if 461 | the predecessor has it or can get it with reasonable efforts. 462 | 463 | You may not impose any further restrictions on the exercise of the 464 | rights granted or affirmed under this License. For example, you may 465 | not impose a license fee, royalty, or other charge for exercise of 466 | rights granted under this License, and you may not initiate litigation 467 | (including a cross-claim or counterclaim in a lawsuit) alleging that 468 | any patent claim is infringed by making, using, selling, offering for 469 | sale, or importing the Program or any portion of it. 470 | 471 | 11. Patents. 472 | 473 | A "contributor" is a copyright holder who authorizes use under this 474 | License of the Program or a work on which the Program is based. The 475 | work thus licensed is called the contributor's "contributor version". 476 | 477 | A contributor's "essential patent claims" are all patent claims 478 | owned or controlled by the contributor, whether already acquired or 479 | hereafter acquired, that would be infringed by some manner, permitted 480 | by this License, of making, using, or selling its contributor version, 481 | but do not include claims that would be infringed only as a 482 | consequence of further modification of the contributor version. For 483 | purposes of this definition, "control" includes the right to grant 484 | patent sublicenses in a manner consistent with the requirements of 485 | this License. 
486 | 487 | Each contributor grants you a non-exclusive, worldwide, royalty-free 488 | patent license under the contributor's essential patent claims, to 489 | make, use, sell, offer for sale, import and otherwise run, modify and 490 | propagate the contents of its contributor version. 491 | 492 | In the following three paragraphs, a "patent license" is any express 493 | agreement or commitment, however denominated, not to enforce a patent 494 | (such as an express permission to practice a patent or covenant not to 495 | sue for patent infringement). To "grant" such a patent license to a 496 | party means to make such an agreement or commitment not to enforce a 497 | patent against the party. 498 | 499 | If you convey a covered work, knowingly relying on a patent license, 500 | and the Corresponding Source of the work is not available for anyone 501 | to copy, free of charge and under the terms of this License, through a 502 | publicly available network server or other readily accessible means, 503 | then you must either (1) cause the Corresponding Source to be so 504 | available, or (2) arrange to deprive yourself of the benefit of the 505 | patent license for this particular work, or (3) arrange, in a manner 506 | consistent with the requirements of this License, to extend the patent 507 | license to downstream recipients. "Knowingly relying" means you have 508 | actual knowledge that, but for the patent license, your conveying the 509 | covered work in a country, or your recipient's use of the covered work 510 | in a country, would infringe one or more identifiable patents in that 511 | country that you have reason to believe are valid. 
512 | 
513 |   If, pursuant to or in connection with a single transaction or
514 | arrangement, you convey, or propagate by procuring conveyance of, a
515 | covered work, and grant a patent license to some of the parties
516 | receiving the covered work authorizing them to use, propagate, modify
517 | or convey a specific copy of the covered work, then the patent license
518 | you grant is automatically extended to all recipients of the covered
519 | work and works based on it.
520 | 
521 |   A patent license is "discriminatory" if it does not include within
522 | the scope of its coverage, prohibits the exercise of, or is
523 | conditioned on the non-exercise of one or more of the rights that are
524 | specifically granted under this License.  You may not convey a covered
525 | work if you are a party to an arrangement with a third party that is
526 | in the business of distributing software, under which you make payment
527 | to the third party based on the extent of your activity of conveying
528 | the work, and under which the third party grants, to any of the
529 | parties who would receive the covered work from you, a discriminatory
530 | patent license (a) in connection with copies of the covered work
531 | conveyed by you (or copies made from those copies), or (b) primarily
532 | for and in connection with specific products or compilations that
533 | contain the covered work, unless you entered into that arrangement,
534 | or that patent license was granted, prior to 28 March 2007.
535 | 
536 |   Nothing in this License shall be construed as excluding or limiting
537 | any implied license or other defenses to infringement that may
538 | otherwise be available to you under applicable patent law.
539 | 
540 |   12. No Surrender of Others' Freedom.
541 | 
542 |   If conditions are imposed on you (whether by court order, agreement or
543 | otherwise) that contradict the conditions of this License, they do not
544 | excuse you from the conditions of this License.  If you cannot convey a
545 | covered work so as to satisfy simultaneously your obligations under this
546 | License and any other pertinent obligations, then as a consequence you may
547 | not convey it at all.  For example, if you agree to terms that obligate you
548 | to collect a royalty for further conveying from those to whom you convey
549 | the Program, the only way you could satisfy both those terms and this
550 | License would be to refrain entirely from conveying the Program.
551 | 
552 |   13. Use with the GNU Affero General Public License.
553 | 
554 |   Notwithstanding any other provision of this License, you have
555 | permission to link or combine any covered work with a work licensed
556 | under version 3 of the GNU Affero General Public License into a single
557 | combined work, and to convey the resulting work.  The terms of this
558 | License will continue to apply to the part which is the covered work,
559 | but the special requirements of the GNU Affero General Public License,
560 | section 13, concerning interaction through a network will apply to the
561 | combination as such.
562 | 
563 |   14. Revised Versions of this License.
564 | 
565 |   The Free Software Foundation may publish revised and/or new versions of
566 | the GNU General Public License from time to time.  Such new versions will
567 | be similar in spirit to the present version, but may differ in detail to
568 | address new problems or concerns.
569 | 
570 |   Each version is given a distinguishing version number.  If the
571 | Program specifies that a certain numbered version of the GNU General
572 | Public License "or any later version" applies to it, you have the
573 | option of following the terms and conditions either of that numbered
574 | version or of any later version published by the Free Software
575 | Foundation.  If the Program does not specify a version number of the
576 | GNU General Public License, you may choose any version ever published
577 | by the Free Software Foundation.
578 | 
579 |   If the Program specifies that a proxy can decide which future
580 | versions of the GNU General Public License can be used, that proxy's
581 | public statement of acceptance of a version permanently authorizes you
582 | to choose that version for the Program.
583 | 
584 |   Later license versions may give you additional or different
585 | permissions.  However, no additional obligations are imposed on any
586 | author or copyright holder as a result of your choosing to follow a
587 | later version.
588 | 
589 |   15. Disclaimer of Warranty.
590 | 
591 |   THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
592 | APPLICABLE LAW.  EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
593 | HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
594 | OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
595 | THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
596 | PURPOSE.  THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
597 | IS WITH YOU.  SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
598 | ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
599 | 
600 |   16. Limitation of Liability.
601 | 
602 |   IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
603 | WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
604 | THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
605 | GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
606 | USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
607 | DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
608 | PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
609 | EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
610 | SUCH DAMAGES.
611 | 
612 |   17. Interpretation of Sections 15 and 16.
613 | 
614 |   If the disclaimer of warranty and limitation of liability provided
615 | above cannot be given local legal effect according to their terms,
616 | reviewing courts shall apply local law that most closely approximates
617 | an absolute waiver of all civil liability in connection with the
618 | Program, unless a warranty or assumption of liability accompanies a
619 | copy of the Program in return for a fee.
620 | 
621 |                      END OF TERMS AND CONDITIONS
622 | 
623 |             How to Apply These Terms to Your New Programs
624 | 
625 |   If you develop a new program, and you want it to be of the greatest
626 | possible use to the public, the best way to achieve this is to make it
627 | free software which everyone can redistribute and change under these terms.
628 | 
629 |   To do so, attach the following notices to the program.  It is safest
630 | to attach them to the start of each source file to most effectively
631 | state the exclusion of warranty; and each file should have at least
632 | the "copyright" line and a pointer to where the full notice is found.
633 | 
634 |     <one line to give the program's name and a brief idea of what it does.>
635 |     Copyright (C) <year>  <name of author>
636 | 
637 |     This program is free software: you can redistribute it and/or modify
638 |     it under the terms of the GNU General Public License as published by
639 |     the Free Software Foundation, either version 3 of the License, or
640 |     (at your option) any later version.
641 | 
642 |     This program is distributed in the hope that it will be useful,
643 |     but WITHOUT ANY WARRANTY; without even the implied warranty of
644 |     MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
645 |     GNU General Public License for more details.
646 | 
647 |     You should have received a copy of the GNU General Public License
648 |     along with this program.  If not, see <https://www.gnu.org/licenses/>.
649 | 
650 | Also add information on how to contact you by electronic and paper mail.
651 | 
652 |   If the program does terminal interaction, make it output a short
653 | notice like this when it starts in an interactive mode:
654 | 
655 |     <program>  Copyright (C) <year>  <name of author>
656 |     This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
657 |     This is free software, and you are welcome to redistribute it
658 |     under certain conditions; type `show c' for details.
659 | 
660 | The hypothetical commands `show w' and `show c' should show the appropriate
661 | parts of the General Public License.  Of course, your program's commands
662 | might be different; for a GUI interface, you would use an "about box".
663 | 
664 |   You should also get your employer (if you work as a programmer) or school,
665 | if any, to sign a "copyright disclaimer" for the program, if necessary.
666 | For more information on this, and how to apply and follow the GNU GPL, see
667 | <https://www.gnu.org/licenses/>.
668 | 
669 |   The GNU General Public License does not permit incorporating your program
670 | into proprietary programs.  If your program is a subroutine library, you
671 | may consider it more useful to permit linking proprietary applications with
672 | the library.  If this is what you want to do, use the GNU Lesser General
673 | Public License instead of this License.  But first, please read
674 | <https://www.gnu.org/licenses/why-not-lgpl.html>.
675 | 
--------------------------------------------------------------------------------