├── GSoC-2021
│   └── OBF-TanishqGupta-GtfBase
│       └── media
│           ├── image1.png
│           ├── image11.png
│           ├── image12.png
│           ├── image14.png
│           ├── image2.png
│           ├── image3.png
│           ├── image4.png
│           ├── image5.png
│           ├── image6.png
│           ├── image7.png
│           ├── image8.png
│           └── image9.png
├── GSoC-2014
│   └── Accepted
│       ├── BioJS-SaketChoudhary-HumanGV
│       │   └── media
│       │       ├── image_0.png
│       │       ├── image_1.png
│       │       ├── image_2.png
│       │       ├── image_3.png
│       │       └── image_4.png
│       ├── KDE-AnujPahuja-KDE-Games-to-KF5
│       │   └── GSoC-KDE.md
│       └── Mozilla-ManishG-Servo
│           └── Mozilla-ManishG-Servo.md
├── GSoC-2023
│   └── Accepted
│       ├── PostgreSQL-RajivHarlalka-pg_statviz
│       │   └── media
│       │       └── ss.png
│       └── PostgreSQL-IshaanAdarsh-postgres-extension-tutorial
│           └── PostgreSQL-IshaanAdarsh-postgres-extension-tutorial.md
├── GSoC-2016
│   └── Accepted
│       ├── MovingBlocks-rzats-Visual-NUI-Editor
│       │   └── media
│       │       ├── image_0.png
│       │       ├── image_1.png
│       │       ├── image_2.png
│       │       ├── image_3.png
│       │       ├── image_4.png
│       │       ├── image_5.png
│       │       ├── image_6.png
│       │       └── image_7.png
│       ├── lowRISC-omasanori-Porting-musl-libc-to-RISC-V
│       │   └── lowRISC-omasanori-Porting-musl-libc-to-RISC-V.md
│       ├── PSF-Kivy-udiboy1209-Tiled-Integration-With-KivEnt
│       │   └── PSF-Kivy-udiboy1209-Tiled-Integration-With-KivEnt.md
│       └── PSF-MeetPragneshShah-RISCV-myHDL
│           └── PSF-MeetPragneshShah-RISCV-myhdl.md
├── GSoC-2024
│   └── accepted
│       └── ToL-RohanBarsagade-Kinfin_Integration
│           ├── media
│           │   ├── flowchart.png
│           │   ├── outdated_py2_syntax.png
│           │   └── util_classes_functions.png
│           └── ToL-RohanBarsagade-Kinfin_Integration.md
├── GSoC-2015
│   ├── Accepted
│   │   ├── Pratyaksh-PSF-Sampling-Algorithms-in-pgmpy
│   │   │   ├── images
│   │   │   │   ├── image00.png
│   │   │   │   ├── image01.png
│   │   │   │   ├── image02.png
│   │   │   │   ├── image03.png
│   │   │   │   ├── image04.png
│   │   │   │   ├── image05.png
│   │   │   │   ├── image06.png
│   │   │   │   ├── image07.png
│   │   │   │   ├── image08.png
│   │   │   │   ├── image09.png
│   │   │   │   ├── image10.png
│   │   │   │   ├── image11.png
│   │   │   │   ├── image12.png
│   │   │   │   ├── image13.png
│   │   │   │   ├── image14.png
│   │   │   │   ├── image15.png
│   │   │   │   ├── image16.png
│   │   │   │   └── image17.png
│   │   │   └── Pratyaksh-PSF-Sampling-Algorithms-in-pgmpy.md
│   │   ├── The-Eclipse-Foundation-AnujPahuja-Cloud-Removal
│   │   │   ├── media
│   │   │   │   └── image_0.png
│   │   │   └── GSoC-GeoTrellis.md
│   │   ├── kamdjouduplex-ASF-ODE-Process-Instance-Visualization-For-Monitoring-Console
│   │   │   ├── media
│   │   │   │   └── image_0.png
│   │   │   └── kamdjouduplex-ASF-ODE-Process-Instance-Visualization-For-Monitoring-Console.md
│   │   ├── SaketC-statsmodels-MixedModels
│   │   │   └── SaketC-statsmodels-MixedModels.md
│   │   └── HimanshuMishra-PSF-Implementing-Add-On-System-For-NetworkX
│   │       └── GSoC-2015-Application-Himanshu-Mishra-Add-on-System-for-NetworkX.md
│   └── Proposed
│       └── Mozilla-ManishG-FormSupportServo
│           └── Mozilla-ManishG-FormSupportServo.md
├── GSoC-2018
│   └── Accepted
│       └── CCExtractor-ShivamKumarJha-ProjectNephos
│           └── media
│               └── media
│                   ├── image15.jpg
│                   ├── image22.png
│                   ├── image23.png
│                   ├── image25.png
│                   ├── image26.png
│                   ├── image27.png
│                   ├── image28.png
│                   ├── image29.png
│                   ├── image30.png
│                   ├── image31.png
│                   ├── image32.png
│                   ├── image33.png
│                   ├── image36.png
│                   ├── image37.png
│                   ├── image38.png
│                   ├── image39.png
│                   ├── image40.png
│                   ├── image41.png
│                   ├── image42.png
│                   ├── image43.png
│                   ├── image44.png
│                   ├── image45.png
│                   └── image46.png
├── GSoC-2013
│   └── Accepted
│       └── Genome-Informatics-SaketChoudhary-eQTL-Pipeline
│           └── media
│               ├── image_0.png
│               └── image_1.jpg
├── GSoC-2022
│   └── Accepted
│       └── CCExtractor-DongyangZheng-Introduce-WebSocket-into-rTorrent
│           ├── media
│           │   ├── RPC workflow.png
│           │   └── class design.png
│           └── DongyangZheng-Introduce WebSocket into rTorrent.md
├── AUTHORS.md
├── walker.py
├── LICENSE
├── README.md
└── GSoC-2012
    └── Accepted
        └── Connexions-SaketChoudhary-OERPub-API
            └── Connexions-SaketChoudhary-OERPub-API.md

[binary media assets (the *.png/*.jpg files listed in the tree above) omitted; each is available at https://raw.githubusercontent.com/saketkc/fos-proposals/HEAD/<path>]
--------------------------------------------------------------------------------
/AUTHORS.md:
--------------------------------------------------------------------------------
CONTRIBUTORS
============

List of people who have contributed

[Manish Goregaokar](https://github.com/Manishearth)

[Anuj Pahuja](https://github.com/alasin)

[Sumith](https://github.com/Sumith1896)

[Pratyaksh Sharma](https://github.com/pratyakshs)

[Himanshu Mishra](https://github.com/OrkoHunter)

[Kamdjou Duplex](https://github.com/kamdjouduplex)

[Meet Udeshi](https://github.com/udiboy1209)

[Meet Pragnesh Shah](https://github.com/meetshah1995)

[Rostyslav Zatserkovnyi](https://github.com/rzats)

[Kalpesh Krishna](https://github.com/martiansideofthemoon)

[OGINO Masanori](https://github.com/omasanori)

[Harsh Gupta](http://hargup.in)

[Dongyang Zheng](https://github.com/Young-Flash)
--------------------------------------------------------------------------------
/walker.py:
--------------------------------------------------------------------------------
#!/usr/bin/env python
import glob
import os
import ntpath

fdirs = ['GSoC-2012', 'GSoC-2013', 'GSoC-2014', 'GSoC-2015']
ignored_f = ['AUTHORS.md', 'README.md']
fdict = {k: {'Accepted': [], 'Proposed': []} for k in fdirs}
for root, dirs, files in os.walk("."):
    for f in files:
        if f.endswith(".md") and f not in ignored_f:
            path = ntpath.split(root)
            linkname = path[1]
            path = ntpath.split(path[0])
            gsoc = path[0][2:]
            acc_or_pro = path[1]
            fdict[gsoc][acc_or_pro].append('[{}]({}/{})'.format(linkname, root[2:], f))

strtext = ''
for key in fdirs:
    value = fdict[key]
    strtext += '- {}\n'.format(key)
    for a, flist in value.iteritems():
        if not flist:
            continue
        strtext += '\t- {}\n'.format(a)
        for fl in flist:
            strtext += '\t\t- {}\n'.format(fl)

print strtext

--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
The MIT License (MIT)

Copyright (c) 2015 Saket Choudhary

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# FOS Proposals
## Express your love for Open Source!

This repository serves as an archive of *F*ree and *O*pen *S*ource proposals.
Since [GSoC](https://developers.google.com/open-source/soc/?csw=1) is the only such
program in which I have participated myself, the dump is organized around it.
If you have proposals from GSoC/KDE-SoC/any other program, feel free to send a PR.


## Proposals

- [GSoC-2012](GSoC-2012)
    - [Accepted](GSoC-2012/Accepted)
        - [Connexions-SaketChoudhary-OERPub-API](GSoC-2012/Accepted/Connexions-SaketChoudhary-OERPub-API/Connexions-SaketChoudhary-OERPub-API.md)
- [GSoC-2013](GSoC-2013)
    - [Accepted](GSoC-2013/Accepted)
        - [Genome-Informatics-SaketChoudhary-eQTL-Pipeline](GSoC-2013/Accepted/Genome-Informatics-SaketChoudhary-eQTL-Pipeline/Genome-Informatics-SaketChoudhary-eQTL-Pipeline.md)
- [GSoC-2014](GSoC-2014)
    - [Accepted](GSoC-2014/Accepted)
        - [Mozilla-ManishG-Servo](GSoC-2014/Accepted/Mozilla-ManishG-Servo/Mozilla-ManishG-Servo.md)
        - [KDE-AnujPahuja-KDE-Games-to-KF5](GSoC-2014/Accepted/KDE-AnujPahuja-KDE-Games-to-KF5/GSoC-KDE.md)
        - [BioJS-SaketChoudhary-HumanGV](GSoC-2014/Accepted/BioJS-SaketChoudhary-HumanGV/BioJS-SaketChoudhary-HumanGV.md)
        - [Sympy-HarshGupta-Solvers](GSoC-2014/Accepted/Sympy-HarshGupta-Solvers/Sympy-HarshGupta-Solvers.md)
- [GSoC-2015](GSoC-2015)
    - [Accepted](GSoC-2015/Accepted)
        - [HimanshuMishra-PSF-Implementing-Add-On-System-For-NetworkX](GSoC-2015/Accepted/HimanshuMishra-PSF-Implementing-Add-On-System-For-NetworkX/GSoC-2015-Application-Himanshu-Mishra-Add-on-System-for-NetworkX.md)
        - [Sumith1896-PSF-Implementing-Polynomial-module-in-CSymPy](GSoC-2015/Accepted/Sumith1896-PSF-Implementing-Polynomial-module-in-CSymPy/GSoC-2015-Application-Sumith-Implementing-polynomial-module-in-CSymPy.md)
        - [Pratyaksh-PSF-Sampling-Algorithms-in-pgmpy](GSoC-2015/Accepted/Pratyaksh-PSF-Sampling-Algorithms-in-pgmpy/Pratyaksh-PSF-Sampling-Algorithms-in-pgmpy.md)
        - [SaketC-statsmodels-MixedModels](GSoC-2015/Accepted/SaketC-statsmodels-MixedModels/SaketC-statsmodels-MixedModels.md)
        - [The-Eclipse-Foundation-AnujPahuja-Cloud-Removal](GSoC-2015/Accepted/The-Eclipse-Foundation-AnujPahuja-Cloud-Removal/GSoC-GeoTrellis.md)
        - [ASF-ODE-Process-Instance-Visualization-For-Monitoring-Console](GSoC-2015/Accepted/kamdjouduplex-ASF-ODE-Process-Instance-Visualization-For-Monitoring-Console/kamdjouduplex-ASF-ODE-Process-Instance-Visualization-For-Monitoring-Console.md)
    - [Proposed](GSoC-2015/Proposed)
        - [Mozilla-ManishG-FormSupportServo](GSoC-2015/Proposed/Mozilla-ManishG-FormSupportServo/Mozilla-ManishG-FormSupportServo.md)
- [GSoC-2016](GSoC-2016)
    - [Accepted](GSoC-2016/Accepted)
        - [PSF-MeetPragneshShah-RISCV-myHDL](GSoC-2016/Accepted/PSF-MeetPragneshShah-RISCV-myHDL/PSF-MeetPragneshShah-RISCV-myhdl.md)
        - [PSF-Kivy-udiboy1209-Tiled-Integration-With-KivEnt](GSoC-2016/Accepted/PSF-Kivy-udiboy1209-Tiled-Integration-With-KivEnt/PSF-Kivy-udiboy1209-Tiled-Integration-With-KivEnt.md)
        - [MovingBlocks-rzats-Visual-NUI-Editor](GSoC-2016/Accepted/MovingBlocks-rzats-Visual-NUI-Editor/MovingBlocks-rzats-Visual-NUI-Editor.md)
        - [Mozilla-KalpeshKrishna-Schedule-TaskCluster-Jobs](GSoC-2016/Accepted/Mozilla-KalpeshKrishna-Schedule-TaskCluster-Jobs/Mozilla-KalpeshKrishna-Schedule-TaskCluster-Jobs.md)
        - [lowRISC-omasanori-Porting-musl-libc-to-RISC-V](GSoC-2016/Accepted/lowRISC-omasanori-Porting-musl-libc-to-RISC-V/lowRISC-omasanori-Porting-musl-libc-to-RISC-V.md)
        - [OBF-VivekRai-Linking_Phenotypes_with_SNP](GSoC-2016/Accepted/OBF-VivekRai-Linking_Phenotypes_with_SNP/OBF-VivekRai-Linking_Phenotypes_with_SNP.md)
- [GSoC-2018](GSoC-2018)
    - [Accepted](GSoC-2018/Accepted)
        - [CCExtractor-ShivamKumarJha-ProjectNephos](GSoC-2018/Accepted/CCExtractor-ShivamKumarJha-ProjectNephos/CCExtractor-ShivamKumarJha-ProjectNephos.md)
- [GSoC-2021](GSoC-2021)
    - [GSoC-2021/OBF-TanishqGupta-GtfBase](GSoC-2021/OBF-TanishqGupta-GtfBase/OBF-TanishqGupta-GtfBase.md)
- [GSoC-2022](GSoC-2022)
    - [GSoC-2022/Introduce WebSocket into rTorrent](GSoC-2022/Accepted/CCExtractor-DongyangZheng-Introduce-WebSocket-into-rTorrent/DongyangZheng-Introduce%20WebSocket%20into%20rTorrent.md)
- [GSoC-2023](GSoC-2023)
    - [GSoC-2023/pg_statviz: Time Series Analysis & Visualization of Statistics](./GSoC-2023/Accepted/PostgreSQL-RajivHarlalka-pg_statviz/PostgreSQL-RajivHarlalka-pg_statviz.md)
    - [GSoC-2023/Postgres Extension Tutorial/Quick Start](./GSoC-2023/Accepted/PostgreSQL-IshaanAdarsh-postgres-extension-tutorial/PostgreSQL-IshaanAdarsh-postgres-extension-tutorial.md)
    - [GSoC-2023/CNCF-DarshanKumar-build_imgext](./GSoC-2023/Accepted/CNCF-DarshanKumar-build_imgext/CNCF-DarshanKumar-build_imgext.md)
- [GSoC-2024](GSoC-2024)
    - [GSoC-2024/Rohan-Kinfin-Integration](GSoC-2024/accepted/ToL-RohanBarsagade-Kinfin_Integration/ToL-RohanBarsagade-Kinfin_Integration.md)


`Proposed` is euphemistic.
*Love can often be one-sided.*

## Adding your proposal?

Opening a pull request with links to a Google Doc should suffice.
In case you want to help more, read on.

All proposals need to be in GitHub-flavored markdown format.
If you have a Google Doc, conversion should be pretty straightforward
using [pandoc](http://johnmacfarlane.net/pandoc/README.html).

```
git clone git@github.com:username/fos-proposals.git
git checkout -b username-year
cd GSoC-year
mkdir Organisation-YourName-Title
cd Organisation-YourName-Title && mkdir media
```

All images should go to `media`, and be relatively linked in your markdown.
To port a `doc/docx` proposal:

```
cd Organisation-YourName-Title
pandoc -r docx infile.docx -w markdown_github -o Organisation-YourName-Title.md --extract-media=media
rm infile.docx
git add .
git commit -am "GSoC-Year: Proposal Title"
git push origin username-year
```

Alternatively, you can use [gdocs2md](https://github.com/mangini/gdocs2md),
which in my experience so far gives superior output to `pandoc`.
Suggestions are always welcome.

# Similar Projects

- [SDSLabs GSoC](https://blog.sdslabs.co/gsoc/)
- [GSoC-2017-Accepted-Proposals](https://github.com/saurabhshri/GSoC-2017-Accepted-Proposals)

# FAQs

- [GSoC-FAQs](https://github.com/OrkoHunter/gsoc-FAQs)

--------------------------------------------------------------------------------
/GSoC-2014/Accepted/KDE-AnujPahuja-KDE-Games-to-KF5/GSoC-KDE.md:
--------------------------------------------------------------------------------
**Name:** Anuj Pahuja

**Freenode IRC Nick:** alasin

**IM Service and Username:** GTalk - kamikazeanuj@gmail.com

**Location:** Gurgaon, India (GMT + 0530 hrs)

**Project Title:** Porting KDE Games to KDE Frameworks 5

**Motivation for Proposal:**

As a part of the porting process of KDE apps to the newly developed KDE Frameworks 5, I would like to port the KDE Games module to KF5. The main motivation behind the project is to keep KDE Games bit-rot-free and maintained with respect to the latest libraries. An initial port of libkdegames to KF5 has been made by Albert Astals Cid but needs to be updated and completed. Being untested, the library also requires some games to be ported to KF5. The project aims at completely porting libkdegames to KF5, followed by a KF5 port of three games from different categories.

**Project Goals:**

**List of Deliverables:**

* KDE Games module (libkdegames) completely ported to KDE Frameworks 5.

* KBounce (Arcade), Naval Battle (Board) and KMines (Logic) ported to KF5/Qt5/Qt Quick 2.

**Investigative work:**

* Dropping the QPainter API in favour of the QSceneGraph API for rendering game graphics, resulting in some performance benefits.

**Implementation:**

As a part of the porting process to KF5/Qt5, it would be a better approach to first port the games to kde4support and then drop the kde4support dependencies in favor of other KF5 methods. As a port of libkdegames has already been made by Albert, porting libkdegames would require much less effort. KBounce, Naval Battle and KMines, on the other hand, would require porting from scratch. The essential steps can be broadly categorized as follows:

* Removal of Qt3Support.

* Making corresponding changes in the build system.

* Modifying the includes and headers.

* Making core macro changes ('slots' -> 'Q_SLOTS' etc.) and updating platform-specific macros ('Q_WS_*' -> 'Q_OS_*' etc.)

* Removing wrapped macros for plugin loading and unit tests.

* Changing method signatures.

* Porting all instances of Qt Quick 1 code to Qt Quick 2/QML2.

* Replacing deprecated methods which have been moved to kde4support by native Qt5 methods.

**Tentative Timeline:**

**22 April - 19 May: (Community Bonding Period)**

* Familiarize myself with the codebase of KDE Games and the common libraries.

* Read through the existing documentation of KDE Frameworks 5 and the important changes that have been made since the previous version.

* Discuss the flow of action and finalize a work plan with the mentor and other developers.

**19 May - 2 June: 2 weeks (Porting libkdegames)**

* Make changes in the build system of libkdegames.

* Remove the kdelibs dependency by using CMake modules from the 'extra-cmake-modules' package.

* Make changes in the UI by replacing deprecated kdelibs methods like KDialog and KAction.

**2 June - 25 July: ~7.5 weeks (Porting KBounce, Naval Battle and KMines)**

* Remove any Qt3Support dependencies.

* Make core functional changes from KStandardDirs, KComponentData etc. to the corresponding Qt5 counterparts QStandardPaths, QCoreApplication etc.

* Replace deprecated KDE 4 methods like KConfig, KSaveFile etc. by the corresponding Qt5/KF5 methods like QSaveFile, QCoreApplication etc.

* Make UI changes by modifying how notifications, fonts, pixel mapping etc. work as per the KF5/Qt5 guidelines. This would include changes in KAction, KDialog, KPixmap, KStyle etc.

* Make IO changes, which will involve removing deprecated KIO::Job methods, including showErrorDialog() and isInteractive(), and porting the KFileItem API to QMimeType.

* Change KFile, KParts and ItemViews functions.

* Port QML/Qt Quick 1 to QML/Qt Quick 2 wherever needed.

A further breakdown of the timeline would be:

* Port KBounce (Arcade) - 2 June - 20 June (~2.5 weeks)

* Port Naval Battle (Board) - 20 June - 7 July (~2.5 weeks)

* Port KMines (Logic) - 7 July - 25 July (~2.5 weeks)

**25 July - 11 August: ~2.5 weeks (Investigative period)**

* Replace uses of the QPainter API with the QSceneGraph API wherever possible. This would be done if time permits and things go as per the timeline.

* Port another game to KF5.

**11 August - 18 August: (Pencils Down)**

* Final testing of the ported games.

* Writing the necessary documentation and porting notes for future developers.

* Cleaning up the code and removing bugs (if any).

**Other Commitments:**

I have my semester exams from May 3 - May 17, which fall in the community bonding period, during which I might not be able to spend much time reading documentation and source code. I will spend extra time before my exams to compensate for that. During the coding period, I have no other commitments and will be able to work 40-45 hours per week except for the last week. My college reopens on August 2, after which I will be able to spend around 30 hours a week. Having successfully participated in the Season of KDE before, I am confident that I will be able to manage my time efficiently and devote as much time as required to complete the project.

**About me:**

I am a third-year undergraduate student pursuing my Bachelors in Computer Science at BITS Pilani. Apart from having lots of interest in computing in general, I have been an open source enthusiast for the past 2 years, creating awareness in and around our college about FOSS.

I started using KDE just 6 months after my introduction to Ubuntu and haven't switched to any other distro since. The range of applications and the great community support encouraged me to contribute to KDE. Consequently, I participated in Season of KDE 2013 and successfully completed my project on implementing new features in KBounce with the help of my mentor Roney Gomes. During the project, I also went through the source code of other KDE Games to find similar features, which helped me understand the generic layout of the KDE Games module.

Being a big fan of KDE Games and having worked on its codebase, I believe I am well-suited for this project and am looking forward to working on it.
I would start working on porting other games as and when the project gets over.

**Link to Patches:**

The changes and patches Roney and I made are in the keyboard-integration branch of KBounce. Here are the links:

[1] https://projects.kde.org/projects/kde/kdegames/kbounce/repository/show?rev=keyboard-integration

[2] https://projects.kde.org/projects/kde/kdegames/kbounce/repository/revisions/keyboard-integration/revisions

--------------------------------------------------------------------------------
/GSoC-2015/Accepted/The-Eclipse-Foundation-AnujPahuja-Cloud-Removal/GSoC-GeoTrellis.md:
--------------------------------------------------------------------------------
**Name:** Anuj Pahuja

**IRC handle:** alasin

**Email:** [kamikazeanuj@gmail.com](mailto:kamikazeanuj@gmail.com)

**Location:** Gurgaon, India (GMT + 0530 hrs)

**Project Title:** Cloud removal from Satellite Imagery in GeoTrellis

**Motivation for the proposal:**

Having worked on multiple image processing and computer vision projects like object tracking, image retrieval and video stabilization, the idea of combining computer vision with a functional language is still new to me and hence all the more interesting. The eagerness to learn and work on something as cool as this, coupled with my interest in computer vision, encouraged me to work on this project and contribute to GeoTrellis. Even after the project ends, I plan to implement other CV algorithms like image segmentation for satellite imagery in GeoTrellis.

**Abstract:**

Clouds and cloud shadows often obscure parts of images acquired by satellites or other space-borne sensors. However, useful image data can be extracted from other relatively non-cloudy images of the same geographical location and used to 'fill' these obscured parts. Multiple images can be captured of an area of interest over time, such as Landsat images. When combined together with the right algorithms, these images can offer a cloud-free (or nearly cloud-free) view of that area. GeoTrellis is already great at distributed raster processing on a cluster. If it were enabled with the power to remove clouds from a large set of imagery, it would enable users to run that process over any amount of imagery, in a scalable way.

**Implementation Details:**

This project will assume that the imagery to be operated on is already stored inside a GeoTrellis system, and can be retrieved by GeoTrellis as a RasterRDD. The retrieval of imagery will be handled by the GeoTrellis system, and while some functionality might be required to correctly produce a set of overlapping imagery to be processed, this project will focus on the cloud removal processing of imagery that is provided by this method.

The cloud removal functionality will be implemented in broadly three steps:

Cloud detection using Fmask[1]:

The Fmask algorithm developed by Zhu et al.[1] will be implemented to detect clouds and cloud shadows in the input images and generate corresponding cloud masks for further processing. The algorithm would take in the Top of Atmosphere (TOA) reflectances for Bands 1, 2, 3, 4, 5, 7 and the Band 6 Brightness Temperature (BT) of the Landsat images as inputs. Then, rules based on cloud and cloud shadow physical properties would be used to extract a potential cloud layer and a potential cloud shadow layer.
This would mainly involve spectral tests on the images to identify Potential Cloud Pixels (PCPs). Some of these tests are:

* Spectral tests on the images to identify Potential Cloud Pixels (PCPs).

* Setting thresholds for NDSI (Normalized Difference Snow Index) and NDVI (Normalized Difference Vegetation Index).

* Performing a Whiteness test for cloud pixel identification.

* Performing a Haze Optimization Transformation (HOT) test.

* Performing a 'Water Test' to separate water and land pixels.

Based on these tests, cloud probability for all pixels will be calculated and a potential cloud layer and cloud shadow layer would be extracted. Finally, the segmented potential cloud layer and the geometric relationships would be used to match it with the potential cloud shadow layer, leading to the production of the final cloud and cloud shadow mask, which will be saved as another image.

Processing the cloud masks and base images:

The image containing the cloud mask would be binarized so as to separate the actual cloud and cloud shadow from other pixels. The mask image would further be processed depending upon the absence of data in the original image. Depending on the position of cloud regions in the *fill* images, or the less cloudy images, a composite labeled cloud mask would be formed, labeling pixels as background pixels, clouded pixels in the base image and clouded pixels in the *fill* images.

Filling cloudy regions:

For filling the cloudy regions, an NSPI cloud fill algorithm by Zhu et al.[2] will be implemented. The algorithm uses a weighted linear model to predict the spectral value of a target pixel from its neighbouring similar pixels. Here is the flowchart describing this approach[2]:

![image alt text](media/image_0.png)

It's important to note that the Fmask algorithm described above is limited to Landsat/ETM+ images and thus restricts the scope of this algorithm. However, the project is in service of implementing cloud removal algorithms for multiband imagery in general, and of making improvements in the framework that allow general algorithms to run over imagery data. Implementation of the Fmask algorithm will be an initial step towards those broader framework improvements. The project aims to be an initial step towards cloud removal for satellite imagery in general.

[1] Zhu, Z. and Woodcock, C. E., Object-based cloud and cloud shadow detection in Landsat imagery, Remote Sensing of Environment (2012).

[2] Zhu, X., Gao, F., Liu, D., Chen, J., 2012. A modified neighborhood similar pixel interpolator approach for removing thick clouds in Landsat images. Geoscience and Remote Sensing Letters, IEEE 9, 521–525.

**Tentative Timeline:**

28 April - 24 May (Community Bonding Period):

* Setting up the development environment.

* Getting familiar with the GeoTrellis codebase and its API use.

* Learning more about Scala and its properties.

* Getting the test dataset and software ready for the project.

* Discussing the flow of action and finalizing a work plan with the mentor.

25 May - 8 June (2 weeks):

* Implementing the first pass of extracting the potential cloud layer, which comprises doing a Cloud Test, Snow Test and Water Test for pixels.

9 June - 16 June (1 week):

* Implementing the second pass of extracting the potential cloud layer by computing cloud probability for all the pixels.

17 June - 24 June (1 week):

* Extracting the cloud shadow layer from the image.

25 June - 9 July (2 weeks):

* Matching the extracted cloud with its corresponding shadow layer and generating the final mask file.

10 July - 24 July (2 weeks):

* Processing base images and the mask files, labeling them accordingly and generating the final mask for the cloud fill algorithm.

25 July - 8 August (2 weeks):

* Implementing the modified NSPI algorithm for filling cloudy regions using the mask obtained above.

9 August - 21 August (Pencils Down):

* Integrating everything with the main software.

* Testing code on other datasets and solving bugs.

* Cleaning up code and writing instructions for use.

**Other Commitments:**

I have no other commitments during the coding period and will be able to work for 40-45 hours per week. Having successfully completed GSoC last year, I am confident that I will be able to manage my time efficiently and devote as much time as required to complete the project.

**About me:**

I am a final-year undergraduate student pursuing my Bachelors in Computer Science at BITS Pilani. Apart from having lots of interest in computing in general, I have been an open source enthusiast for the past 3 years, have participated in GSoC 2014 with KDE, and have also mentored students in Season of KDE and Google Code-In.

I have a keen interest in image processing and computer vision, having worked on motion-based object tracking, video stabilization, image analysis methods, content-based image retrieval and image deconvolution. Working on computer vision with Scala will be a first for me and I hope the experience will keep getting better.

My GitHub profile is [here](https://github.com/alasin/).

My LinkedIn profile is [here](http://in.linkedin.com/in/anujpahuja).

--------------------------------------------------------------------------------
/GSoC-2022/Accepted/CCExtractor-DongyangZheng-Introduce-WebSocket-into-rTorrent/DongyangZheng-Introduce WebSocket into rTorrent.md:
--------------------------------------------------------------------------------
**Personal Information**
=====================
- Name: Dongyang Zheng
- Email: flash12561256@gmail.com
- Github: [Young-Flash](https://github.com/Young-Flash)
- Blog: [blog](https://young-flash.github.io/)
- Location: Guangzhou, China
- Time Zone: UTC+8
- University: South China Normal University
- Major: Computer Science And Technology

**Project Description**
=====================
- Name: [Implement a modern RPC interface for rTorrent](https://ccextractor.org/public/gsoc/rtorrent-modern-rpc/)
- Mentor: [@jesec](https://github.com/jesec)
- Goal:
    - **modernization**: implement JSON-RPC over WebSockets support on top of the
      existing JSON-RPC over SCGI (raw socket) implementation, hook the event-emitting
      functions and send events to WebSocket subscribers (a minimal sketch follows this section).
    - **performance**: consider replacing the global lock with a shared lock.
    - **security**: TLS support.
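
To make the modernization goal concrete, a minimal sketch of a JSON-RPC-over-WebSockets endpoint built on uWebSockets might look like the snippet below. The `/jsonrpc` URL pattern, the port, the `events` topic and the `dispatch_jsonrpc()` helper are placeholders for illustration only; the helper stands in for whatever entry point the existing JSON-RPC (SCGI) implementation exposes and is stubbed out here so the sketch compiles.

```cpp
// Minimal sketch: JSON-RPC over WebSockets with uWebSockets.
// dispatch_jsonrpc() is a hypothetical stand-in for the JSON-RPC handler
// that already backs the SCGI transport; it is stubbed so the sketch compiles.
#include <string>
#include <string_view>
#include <uWebSockets/App.h>

// Stand-in for the existing JSON-RPC request handler (assumption, not rTorrent API).
std::string dispatch_jsonrpc(std::string_view request) {
    (void)request;  // the real handler would parse and execute the request
    return R"({"jsonrpc":"2.0","id":null,"result":[]})";
}

struct PerSocketData {};  // per-connection state (none needed yet)

int main() {
    uWS::App()
        .ws<PerSocketData>("/jsonrpc", {
            .open = [](auto *ws) {
                // Subscribe every client to event notifications ("server push").
                ws->subscribe("events");
            },
            .message = [](auto *ws, std::string_view msg, uWS::OpCode /*opCode*/) {
                // Forward the JSON-RPC request to the handler and send the
                // JSON response back on the same connection.
                ws->send(dispatch_jsonrpc(msg), uWS::OpCode::TEXT);
            }
        })
        .listen(5000, [](auto *listenSocket) {
            // listenSocket is nullptr if binding the port failed.
            (void)listenSocket;
        })
        .run();
}
```

Connecting with a WebSocket client (e.g. Postman, as mentioned in the timeline below) and sending a JSON-RPC request to ws://localhost:5000/jsonrpc would return the stubbed response, which is enough to validate the transport before wiring in the real handler.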
22 | 23 | 24 | **About me** 25 | ===================== 26 | My major programming language is C++ and I really like it, I understand the syntax of C++, new features in modern C++, some knowledge about CMake and network programming over 27 | TCP/IP, Websocket. 28 | 29 | I have **open source experience**. I participated in a activity (named Summer 2021) similar to GSoC last summer and successfully passed the review of mentor and organizers. The project I am responsible for is to encapsulate the JDBC protocol for a graph database named Nebula Graph, it was done by me almost independently (of course, with the guidance of mentor), the 30 | output (code and documentation) have been merged into official github repository. Now I am also a volunteer contributor in this open source community and continue to maintain this project. You can see the activity info [here](https://summer-ospp.ac.cn/2021/#/?lang=en) and my outcome can be seen [here](https://github.com/nebula-contrib/nebula-jdbc). I am in my master degree now, majoring computer science and technology in [Social Network Data Intelligence Lab](https://www.scholat.com/team/scholatteam), research area in Knowledge Graph, published a paper as the first author, more info about me can be seen [here](https://www.scholat.com/zhengdongyang). 31 | 32 | **Homework** 33 | ===================== 34 | - Build from source and run rTorrent, make dev environment and dependencies ready on my machine. 35 | - Read source code in src/rpc and other parts related, understand the core concepts and RPC workflow 36 | 37 | 38 | - Study the websocket protocol by [RFC6455](https://www.rfc-editor.org/rfc/rfc6455) and using the websocket framework, make a [quick demo](https://github.com/Young-Flash/websockets-demo) about how to use uWebsockets to implement a server which supports publish and subscribe over websockets protocol (with ssl). 39 | 40 | - Accomplished qualification task posted by mentor, it's [here](https://github.com/Young-Flash/translator). 41 | - I implement a "translator", which takes JSON arguments from Websocket client, reformat it into SCGI and forward it to rTorrent, waiting response from rTorrent and then pass the response to the websocket client. 42 | - The core class design and data flow is shown below: 43 | 44 | 45 | - **Add unix domain socket support for uWebSockets, it's [here](https://github.com/Young-Flash/uWebSockets).** 46 | 47 | - for uSockets: add interface `(LIBUS_SOCKET_DESCRIPTOR bsd_create_listen_socket_unix(const char *path, int options))` which takes the path to socket file as param to setup a listen socket. 48 | 49 | - for uWebSockets: add interface `(TemplatedApp &&listen_unix(std::string path, int options, MoveOnlyFunction &&handler))` to support setup websocket server listening on socket file and set the permission of the socket file to 700 (S_IRWXU) defaultly 50 | 51 | **Plan** 52 | ===================== 53 | 54 | - Steps 55 | 1. **for modernization**: 56 | 57 | introduce uWebsockets into the rTorrent to make client can communicate with rTorrent via websocket protocol, client can subscribe some specific topics, once the event occurs, server will push the notification to client automatically; implement some callback function and bind to the events we care about, execute them when event occurs. 58 | 59 | 2. **for security**: 60 | 61 | upgrade ws (websocket) to wss (websocket with TLS), specifies the path of ssl credential in the config file of rTorrent, when the program launch, load the credential to init a wss connection 62 | 3. 
**for performance**: 63 | 64 | this concerns the global lock (a static member (std::mutex) of the class torrent::thread_base). The global lock is used to interrupt not the RPC routines but rather the "main_thread", so changing this may require modifying code in libtorrent (kind of tricky, so it is left for future consideration). 65 | 66 | - 3rd party libraries: [uWebSockets](https://github.com/uNetworking/uWebSockets) (Apache License 2.0) or [libwebsockets](https://github.com/warmcat/libwebsockets) (MIT license) 67 | 68 | - Support I need from the community: discussing things with my mentor whenever questions come up. 69 | 70 | **Timeline** 71 | ===================== 72 | 73 | ## official overview 74 | 75 | - **May 20**: accepted GSoC contributor projects announced; 76 | - **May 21 - June 12**: Community Bonding Period | GSoC contributors get to know mentors, read documentation and get up to speed to begin working on their projects; 77 | - **June 13 - July 25**: coding officially begins; 78 | - **July 26 - September 4**: Work Period | GSoC contributors work on their project with guidance from mentors; 79 | - **September 5 - September 12**: final week. 80 | 81 | ## details 82 | 83 | - **before May 20** 84 | - communicate with my mentor to understand and detail the goals of the project; 85 | - get familiar with the WebSocket protocol and make a demo ([here](https://github.com/Young-Flash/websockets-demo)); 86 | - get the [qualification task](https://github.com/Young-Flash/translator) done. 87 | 88 | - **May 21 - June 12** 89 | - clarify the RPC call process in detail and replace the SCGI routine with WebSocket in a thread; first make it work over TCP (ip:port), then consider adding unix domain socket support 90 | - test the connection between rTorrent and a WebSocket client (using [Postman](https://www.postman.com/)) 91 | 92 | - **June 13 - July 25** 93 | - implement "server push" capability: design callback functions (to notify clients) that execute automatically when events happen (download state changes, etc.). **Current idea**: when the state of a torrent changes, `DL_TRIGGER_EVENT(download, event_name)` will be called, so we can hook event emitting there according to the parameter "event_name". 94 | - Taking `event.download.finished` as an example, when this event happens the WebSocket routine will send the message below to the client: 95 | ```json 96 | { 97 | "id":null, 98 | "jsonrpc":"2.0", 99 | "result":[ 100 | { 101 | "torrent_hash_value":"F243DA99EB8A210A5B8AC480878B4564DF6BE221", 102 | "state":"finished", 103 | "time_stamp":1649656905 104 | } 105 | ] 106 | } 107 | ``` 108 | - prepare for the phase 1 evaluation 109 | 110 | - **July 26 - August 4** 111 | - buffer week for any unpredictable problems 112 | 113 | - **August 5 - September 4** 114 | - consider adding unix domain socket support for uWebSockets / uSockets; this may be challenging, as it is not merely a matter of setting `AF_LOCAL` at socket setup but also requires considering both configuration and API compatibility.
This work achieved initial progress, I will try to make a [PR](https://github.com/uNetworking/uWebSockets/pull/1486) to uWebSockets, other features should be add such as make it compiles on Windows, client connection function and benchmark, etc 115 | - try to reduce the granularity of lock in libtorrent; 116 | - prepare for the final evaluation 117 | 118 | - **September 5 - September 12** 119 | - code style and quality review, dev documentation 120 | - prepare for final evaluation 121 | 122 | **Why me** 123 | ===================== 124 | I like modern C++ and the project was written in C++ 17, I also interested in network programming and the project requires websocket and RPC. I think my technology stack matches the requirements of this project. When I find this project and make some research on it (build and run rTorrent and read the source code of it, communicate with mentor about project requirements and details), I make a decision that all in this one and leave others for no consideration, this is also my usual style of doing things, making decisions after research and focusing on it attentively. 125 | 126 | I love the way open source works, and participating in open Source activities has introduced me to things I'd never known before, improved my coding skills and met a lot of interesting people who I've enjoyed communicating with, and I hope to contribute more to the open source world. I first got involved with open source at Summer 2021 (mentioned above) last year, but my open source experience didn't end with the end of the activity, now I am a volunteer contributor in the Nebula Graph community, maintaining on the projects I was previously working on. If I'm lucky enough to be selected for this project, I'll take it seriously, keep contributing to the community by continuing to maintain it in the future. Although GSoC 2022 may be over, my enthusiasm for open source isn't. -------------------------------------------------------------------------------- /GSoC-2012/Accepted/Connexions-SaketChoudhary-OERPub-API/Connexions-SaketChoudhary-OERPub-API.md: -------------------------------------------------------------------------------- 1 | ### Connexions Google Summer Of Code Student Application 2 | 3 | Name: Saket Kumar Choudhary 4 | 5 | Street Address: A 19 Hastinapur , Anushakti Nagar 6 | 7 | City: Mumbai State: Maharashtra Postal/Zip Code: 400094 8 | 9 | Email Address: saketkc@gmail.com 10 | 11 | Phone: +91-9869649197 12 | 13 | University: Indian Institute of Technology Bombay,Mumbai 14 | 15 | Connexions Project You Are Interested In:Slide Importer client using the OERPub publishing API 16 | 17 | 1. **List your programming skills as related to the project description** 18 | 19 | I am a 20 year old developer,a FOSS contributor/supporter and a Python & Ruby enthusiast. I am currently pursuing my undergraduate studies at Chemical Engineering Department of Indian Institute of Technology Bombay,India. My topics of special interest are WebFrameworks , Algorithms and Software-Design. 20 | 21 | I have been programming for three years now . I started out with C/C++ in my first year of undergraduate studies and moved ahead to try my hand at PHP, Python and Ruby. I am comfortable with MVC Frameworks(CakePHP,Django,RAILS) and have also worked with microwebframeworks like Sinatra and Flask. 22 | 23 | 2. **Have you worked on an open source project before? 
If so, give the project name and describe the work you did.** 24 | 25 | I have submitted few pathces as listed down here:[ http://home.iitb.ac.in/~saket.kumar/bugsandpatches.html](http://home.iitb.ac.in/~saket.kumar/bugsandpatches.html) 26 | 27 | Some of my Open Source Projects so Far have been : 28 | 29 | 1. **Scilab on Cloud** 30 | 31 | [[https://github.com/saketkc/scilab_cloud](https://github.com/saketkc/scilab_cloud)]: 32 | 33 | I developed a Python based application that would enable runnign Scilab[[http://scilab.org/](http://scilab.org/)] a scientific computing software. This app allows user to run his/her Scilab codes online through a browser, thus removing the need to install Scilab Client locally.I soleley developed thi app on my own 34 | 35 | 2. ** IIT Bombay Grading System on SMS** 36 | 37 | **[**[https://github.com/saketkc/iitb-library-sms-interface](https://github.com/saketkc/iitb-library-sms-interface)**] **In a team of 4 I developed a Flask[[http://flask.pocoo.org/](http://flask.pocoo.org/)] based app to scrape through our institute’s Grading interface and send the Grades to a user on his mobile on request. I wrote the Scraping function making use of Beautiful Soup and urlib2 libraries. 38 | 39 | 3**. Pivotal Tracker Email Bot** 40 | 41 | [[https://github.com/saketkc/pivotal-tracker-email-wrapper](https://github.com/saketkc/pivotal-tracker-email-wrapper)] : 42 | 43 | Pivotal Tracker(http://pivotaltracker.com) is an online Agile project Management tool. I sloely developed an interface using their API to create "New Issues" by fetching the emails from a gmail account. This has been recognised as a third party tool by Pivotal :[ ](http://www.pivotaltracker.com/help/thirdpartytools)[http://www.pivotaltracker.com/help/thirdpartytools.](http://www.pivotaltracker.com/help/thirdpartytools.)[ ](http://www.pivotaltracker.com/help/thirdpartytools) 44 | 45 | 4. **DropBox on SMS** 46 | 47 | [[https://github.com/saketkc/dropbox_on_sms](https://github.com/saketkc/dropbox_on_sms)]: 48 | 49 | This "Sinatra"[[http://www.sinatrarb.com/](http://www.sinatrarb.com/)] based app that lets one send a file from your dropbox folder to any email id , just with one SMS. This app stood among the Top 20 at Yahoo! Open Hack India [ September 2011]. I was the sole developer of this complete app. 50 | 51 | 5. **kicad-ngspice** 52 | 53 | [[https://github.com/saketkc/kicad-ngspice](https://github.com/saketkc/kicad-ngspice)] 54 | 55 | Imeplented Python Code to convert KICAD generated netlist to ngspice friendly format, the original netlist could not be exported directly to ngspice due to improper node configurations . 56 | 57 | Also made a GUI using PyGTK for making a file editor. 
58 | 59 | 6.** SlideShare** 60 | 61 | **[**[https://github.com/saketkc/SlideShare-Instants](https://github.com/saketkc/SlideShare-Instants)**] and [**[https://github.com/saketkc/SlideShare-Desktop-app](https://github.com/saketkc/SlideShare-Desktop-app)**]** 62 | 63 | I had made a few hacks using SlideShare API, one of them made a GUI on the Desktop for being slideshows[was unsuccessful] and the other one would work like Google Instant to fetch slideshows instantly from SLideshare[successfull] 64 | 65 | Some of my other projects can be found at : 66 | 67 | [https://github.com/saketkc/](https://github.com/saketkc/) 68 | 69 | I did my Summer Internship,2011 at SlideShare one of the biggest Ruby on Rails site in the world [a platform for SlideShow uploading and viewing online] wher I designed and implemented Admin Interface to allow super user actions like delete,suspend,view users and the slideshows. I also implemented specs (Behaviour Driven Development(BDD) )for the Ruby code using RSpec.My expeerinces from the same are featured here :[ http://engineering.slideshare.net/2011/09/applying-for-a-slideshare-internship-read-this-first-hand-account/](http://engineering.slideshare.net/2011/09/applying-for-a-slideshare-internship-read-this-first-hand-account/) 70 | 71 | **Bio: **[http://home.iitb.ac.in/~saket.kumar/saket_cv.pdf](http://home.iitb.ac.in/~saket.kumar/saket_cv.pdf) 72 | 73 | 3. What do you hope to gain from participating in this project? 74 | 75 | I would gain an indepth knowledge about writing scalable and modular code in Python . Though I have done a lot pf projects invoving Python and JS , but noe of them have taught me to write scalable code. As this project would actually go live on a Web Server and would be millions of people, not only it should be scalable but thoroughly tested to . 76 | 77 | presentations and related stuff. 78 | 79 | I use online resources myself for my own studies , and such I believe Connection is a great contributor to the society by providing a platform for everyone to refer Classroom notes , presentations and related stuff. Contributing my code towards the betterment of Connexions will be a privilege for me , my way of contributing back to the community 80 | 81 | Besides that here are my own visions for the project.: 82 | 83 | 1. **Conversion Of Hand Written Notes**: 84 | 85 | A lot people using tablets make their notes i nform of Windows Journals or scanned ppaers f original handwritten notes. Right now the Module UPload API doesn’t support PDFs and OCR. A lot of content can be generated of the API can be made open for reading PDFs and Images[ scanned papers]. 86 | 87 | 2. **Using EPubs format for uploading** : 88 | 89 | A lot of people have shifted entirely to epubs for reading and writing notes. The API needs to be extended to allow for EpuB Conversion to say a .doc and then upload it as a ".doc" . Though this is already part of another GSoC project in the form of rendering epubs, uploading is also an essential feature. 90 | 91 | 3. **Generating more content from SlideShare:** 92 | 93 | We can search SlideShare for a particular tag and for every slideshow that is tagged say "science and technology" , fetch the user contact information from slideshare and then send him/her a request form asking to upload his/her content on cnx.org. 94 | 95 | This can be automated in the sense that the user agrees to the CC license by replying to the reequest mail from cnx.org and then the uploading takes place on its own. 96 | 97 | 4. 
**Add a short outline of how you will approach your selected project. You could include things like a timeline or a design if needed.** 98 | 99 | **Action Plan:** 100 | 101 | * Before May 21st: 102 | 103 | 1. Discuss ideas about introducing Facebook/Twitter connect with the mentors. This would make sharing more easy and more content oriented 104 | 105 | 2. Read and study the Pyramid Framework 106 | 107 | 3. Discuss with the mentors a testing framework/method to be used for testing the functionalities of the website 108 | 109 | 4. Study the existing code of the OERPPub Publishing API 110 | 111 | 5. Try fixing the pre-existing bugs 112 | 113 | * Week 1 May 21st - May 28th: 114 | 115 | 1. Read the Google Docs API and CNXML documentation 116 | 117 | 2. Discuss and set expectations of code – dependencies, packages required , code style, testing framework , and documentation. 118 | 119 | * Week 2 May 29th - June 5th and Week 3 June 6th- June 13th : 120 | 121 | 1. Develop a module for uploading slides into cnx.org with introductory paragraphs 122 | 123 | 2. Write test cases for the module 124 | 125 | * Week 4 June 14th - June 21st 126 | 127 | 1. Use the SlideShare and Google Docs API so that the modules can be made embeddable 128 | 129 | 1. Write test cases for the plugin 130 | 131 | * Week 5 June 22nd - June 29th 132 | 133 | 1. Create the link box to download the original slides 134 | 135 | 2. Using the publishing API, write code to publish the mopdule on cnx.org 136 | 137 | 3. Write test cases for the code 138 | 139 | * Week 6 June 30th - July 6th and Week 7 July 7th - July 14th 140 | 141 | 1. Implement Audio/Video toolbox along with the slide player to go along with the slides 142 | 143 | 2. Implement "Beautiful code “ for the main page so that some of the sample code features up on the main page 144 | 145 | 3. Write tests for beautiful code model 146 | 147 | * Week 8 July 15th- July 22nd 148 | 149 | 1. Implement slide by slide printing option for printing to either .pdfs or .epubs 150 | 151 | 2. Write test cases for printing 152 | 153 | * Week 9 July 23d - July 30 154 | 155 | 1. Implement OCR to allow uploading of images/scanned notes 156 | 157 | 2. Test OCR code 158 | 159 | * Week 10 July 31st - August 6th. 160 | 161 | 1. Implement the sites mobile version 162 | 163 | 2. Test slide rendering in mobile version 164 | 165 | * Week 11th August 7- August 13th 166 | 167 | 1. Document code 168 | 169 | 2. Bug Fixing 170 | 171 | * Week 12 August 14th - August 20 172 | 173 | 1. Test the functionality of the website 174 | 175 | 2. Complete Documentation 176 | 177 | 3. Bug Fixes 178 | 179 | **Obligations** 180 | 181 | **I have no obligations during the summer s and would be able to contribute y full time to GSoC. 
I will be available full time on GTalk and IRC** 182 | 183 | 184 | 185 | 186 | 187 | -------------------------------------------------------------------------------- /GSoC-2015/Accepted/kamdjouduplex-ASF-ODE-Process-Instance-Visualization-For-Monitoring-Console/kamdjouduplex-ASF-ODE-Process-Instance-Visualization-For-Monitoring-Console.md: -------------------------------------------------------------------------------- 1 | ## Google Summer of Code 2015 Project Proposal 2 | 3 | ## Apache Software Foundation/ODE-1029 4 | 5 | Project Title: Process Instance Visualization for Monitoring Console 6 | 7 | ## 1- Personal Details 8 | 9 | 10 | 11 | **Contact details** 12 | 13 | **Name:** Kamdjou Temfack Duplex Marie 14 | 15 | **Email:** tony14pro@gmail.com 16 | 17 | **Phone**: +237 670274538 18 | 19 | **IRC nickname:** JS_man 20 | 21 | **Blog**: [http://js-man.blogspot.com](http://js-man.blogspot.com) 22 | 23 | **Github:** [https://github.com/kamdjouduplex](https://github.com/kamdjouduplex) 24 | 25 | **Linkedin:** kamdjou temfack 26 | 27 | **About me** 28 | 29 | * I am a final year Software Engineering student at the University Buea,Cameroon. I am passionate and skilled in User Experience Design (UX) using web technologies and languages like JavaScript, AngularJS, HTML5, CSS3, Twitter Bootstrap and Node Package Manager (npm). 30 | 31 | * I have worked on several projects and was a Regional finalist for the Microsoft Imagine Cup 2014 contest, using HTML5 and JavaScript technologies with a core goal to differentiate my projects through added value on simple basic online tools with great User Interfaces and User Experiences. I believe that abstracting the underlying processes in a product from a user so that he can focus on using his/her tools to improve his productivity is not enough; I believe that as a product developer, I am tasked with making the user’s experiences to be pleasurable so that the user’s work is not only easy, but is desirable, loved and memorable. 32 | 33 | * I worked in a team during two hackdays to build the GDG Buea site [http://www.gdgbuea.org](http://www.gdgbuea.org) using (**Polymer, JavaScript, Bootstrap, HTML5 & CSS3**), I also participated in building a realtime event management app for TV fans using AngularJS and Firebase (site:[https://yopisode-event.firebaseapp.com/#/](https://yopisode-event.firebaseapp.com/#/) repo:[https://github.com/kamdjouduplex/yopevent-developments](https://github.com/kamdjouduplex/yopevent-developments)). I have as well built an application in preparation for the upcoming Google Cloud Challenge using Java EE, JavaScript,HTML5,CSS3, Google AppEngine, BlobStore and Datastore. (site:[http://larrytectest.appspot.com/contact](http://larrytectest.appspot.com/contact) repo: [https://github.com/kamdjouduplex/tech-biz](https://github.com/kamdjouduplex/tech-biz)) 34 | 35 | * I have handled a two-day AngularJS meetup in Buea to introduce the University Developer Community to AngularJS 36 | 37 | * I am an active member of the Google Developer Group (GDG) of Buea Cameroon 38 | 39 | **Interests** 40 | 41 | I am always very excited about projects that involves the use of interfaces to create new and better experiences. I love the web and mobile worlds. My love for using interfaces to create lovable abstractions on the web pushed me to love and master HTML5, CSS3, JavaScript (jQuery, AngularJS). I especially love the new possibilities on the browser when HTML5 new features like Canvas play with JavaScript. 
42 | 43 | I am very motivated about this project because it will enable me to use these technologies to abstract away the complexities of the console with clear graphical visualizations. 44 | 45 | ## 2- Personal Availability 46 | 47 | I have reviewed the GSoC 2015 timeline and I agree that I will be fully available, without any constraints, during the period mentioned there, except that I may be off for one week at the end of July for my second semester exams. 48 | 49 | ## 3- Project Abstract 50 | 51 | Business Process Execution Language (BPEL) is a language for orchestrating web services. I will focus on “Process Instance Visualization for Monitoring Console”: translating “.bpel” files to “.html” files so that the status of the different activities of each process can be seen at a glance. The main objective is to graphically render BPEL (Business Process Execution Language) processes so that users can tell which activities have already completed, which are still pending and which have failed, while staying as consistent as possible with the existing UI components so that users become familiar with the platform within a short time. 52 | 53 | ## 4- Implementation 54 | 55 | **UI and UX Design** 56 | 57 | Since we already have the basic console, I will implement a graphical representation of the detailed activities of each process by converting each activity name and status into a corresponding graphical component. Each component includes a small customised icon at its top-right corner that immediately tells the user the status of the activity. We will also make it possible for the user to see all the information about an activity by hovering the mouse over it. 58 | 59 | The figure below illustrates this.
60 | 61 | ![image alt text](media/image_0.png) 62 | 63 | The two black circles with the play and pause symbols at the top-right are zoomed-in versions of the small blue icons shown in each activity, and such an icon can represent any status. 64 | 65 | **Proposed Approach (Algorithm)** 66 | 67 | We know that from the .bpel file it is possible to fetch the name and the status of each activity. By using a two-dimensional map initialized with all the possible cases, we can easily display the correct graphical component, as sketched in the algorithm snapshot below: 68 | 69 | /* create and initialise our lookup map here */ 70 | 71 | **var map_array = {};** 72 | 73 | **map_array["activity1"] = {** 74 | 75 | **  "status1": component1,** 76 | 77 | **  "status2": component2,** 78 | 79 | **  "status3": component3,** 80 | 81 | **  "status4": component4** 82 | 83 | **};** 84 | 85 | /* ... and similarly for the other activities ... */ 86 | 87 | 88 | 89 | /* the values of a and s (activity and status) are fetched from the .bpel file and handled like this */ 90 | 91 | **function displayStatusComponent(a, s) { **// where a and s represent respectively "activity" and "status" 92 | 93 | /* from the values of "activity" and "status" passed as arguments we can display the corresponding component */ 94 | 95 | **  if (map_array[a] === undefined || map_array[a][s] === undefined) {** 96 | 97 | /* if the activity and status combination does not exist we */ 98 | 99 | /* display a specific fallback component for that */ 100 | 101 | **    alert(specific_component);** 102 | 103 | **  } else {** 104 | 105 | **    alert(map_array[a][s]);** 106 | 107 | **  }** 108 | 109 | **}** 110 | 111 | **Schedule and Timeline** 112 | 113 | I propose to release a beta version of the project after 8 weeks and then update it based on user feedback, using agile development methods. I will be able to devote at least 6 hours per day on weekdays and 10 hours per day during the weekend, giving at least 40 hours per week. 114 | 115 | March 27 - April 27: 116 | 117 | (4 weeks) 118 | 119 | * Study the Extensible Stylesheet Language Transformations (XSLT) language 120 | 121 | * Study the AngularJS ode-console repository [https://github.com/kamdjouduplex/ode-console](https://github.com/kamdjouduplex/ode-console) 122 | 123 | * Further studies of WS-BPEL 124 | 125 | April 28 - May 25: Community bonding period 126 | 127 | (4 weeks) 128 | 129 | * Study ODE/BPEL manuals, tutorial series and documentation. 130 | 131 | * Compile the ODE-CONSOLE source code, study the code base and remove bugs. 132 | 133 | * Discuss with other developers and ODE/BPEL mentors to define mailing-list etiquette 134 | 135 | * Acquaint myself with necessary technologies like HTML5 canvas/SVG and Extensible Stylesheet Language Transformations (XSLT) 136 | 137 | May 26 - July 24: Work period (Pre-mid term evaluation) 138 | 139 | (2 weeks) 140 | 141 | * Implementation of the added graphical components 142 | 143 | * Integrating those components into the front end of the project 144 | 145 | (2 weeks) 146 | 147 | * Using AngularJS to bind activity status changes to the auto-layout component 148 | 149 | * Integrating a vertices feature to relate process activities to one another 150 | 151 | July 25 - August 12: Pencils down period (Testing and Documentation) 152 | 153 | (2 weeks) 154 | 155 | * Final testing and debugging of the ode-console code.
156 | 157 | * Final touching 158 | 159 | August 13 -August 21: Pencils down (Documentation and Final submission) 160 | 161 | (1 week) 162 | 163 | * Documenting the Process Instance Visualization for Monitoring Console code 164 | 165 | * Final submission of ode-console code to Google 166 | 167 | **5- ****W****hy**** APACHE ****SOFTWARE**** FOUNDATION ?** 168 | 169 | First of all, I really believe that software can change the world by providing new technologies and that software should be free. I choose APACHE FOUNDATION because it is a not-for-profit technology organization which offers me the opportunity to assist the world of web services (WS) with tools which help users orchestrating and monitoring their processes without any stress . Working with APACHE/ODE also helps me contribute over the long-term and gain status within the hacker community based on my Engineering/Designing background and academic interests. Although I have not contributed before to open source ,I see implementing a graphical visualization as a long-awaited opportunity to provide more easier appreciation of processes orchestration for the APACHE/ODE software and a jump-start to my continued contribution to open source software through APACHE FOUNDATION.This project will also help APACHE FOUNDATION and other organizations dealing with Web Services to integrate the graphical representation of activities on any other project. 170 | 171 | ## 6- Why am I the right person to take this project ? 172 | 173 | I am passionate about open source technologies and i have used the Apache Web server together with other open source technologies for my personal and class projects. I believe in the open source culture of sharing and will like to give back to the Open source community by my participation in this year's GSOC 2015. 174 | 175 | An African contributing to open source technology will go a long way to increase technology awareness in our continent and will drive the innovators of the future. 176 | 177 | -------------------------------------------------------------------------------- /GSoC-2016/Accepted/lowRISC-omasanori-Porting-musl-libc-to-RISC-V/lowRISC-omasanori-Porting-musl-libc-to-RISC-V.md: -------------------------------------------------------------------------------- 1 | --- 2 | title: 3 | Porting musl libc to RISC-V 4 | author: 5 | Masanori Ogino 6 | date: 7 | \today 8 | abstract: 9 | musl is an implementation of the C standard library for Linux. It is clean, 10 | robust, efficient and conformant with most of the ISO C11 and POSIX 2008 11 | standards. This project aims to port musl to RISC-V to provide an attractive 12 | alternative to glibc for the open instruction set architecture ecosystem. 13 | ... 14 | 15 | # Motivation 16 | 17 | Newlib and glibc have already been ported to the RISC-V ISA. The former is used 18 | for bare-metal programs and the latter is for Linux-based systems. While glibc 19 | is the most popular Linux libc for servers and PCs, there are some alternative 20 | implementations. 21 | 22 | [musl](http://www.musl-libc.org/) is an implementation of libc for Linux. musl 23 | is used by some Linux distribution including Alpine Linux, Sabotage Linux and 24 | Void Linux as their primary libc. musl features an excellent support for static 25 | linking while it supports dynamic linking efficiently too. 
According to 26 | "[Comparison of C/POSIX standard library implementations for 27 | Linux](http://www.etalabs.net/compare_libcs.html)", the code size of musl is 28 | much smaller than that of glibc for static linking. 29 | 30 | While musl focuses on code size, it offers most of the C APIs in ISO C11 and 31 | POSIX 2008 standards with some extensions provided by BSD libc and glibc. musl 32 | is designed to be robust, e.g. consideration to algorithmic complexity attacks, 33 | avoiding unnecessary dynamic allocation, etc. 34 | 35 | The goal of this project is porting musl to RISC-V. Due to its code size and 36 | robustness, it will become an attractive alternative to glibc especially for 37 | developers of embedded systems. It will also involve opportunities to find 38 | failures in other layers: kernels, toolchains, simulators or hardware. 39 | Moreover, its simple and clean code base will yield benefits for both 40 | production and education use. 41 | 42 | # Related Work 43 | 44 | There is a [RISC-V port of musl](https://github.com/lluixhi/musl-riscv) by Aric 45 | Belsito. Although it is not merged to the upstream tree, it already has most of 46 | the mandatory part of porting and it looks good to me. Thus, I will use the 47 | port as the basis. 48 | 49 | # Targets 50 | 51 | RISC-V is a modular ISA: it consists of base instruction sets and some standard 52 | extensions. This proposal assumes RV32G and RV64G as main targets; they are 53 | 32-bit and 64-bit variants of RISC-V ISA with multiplication and division 54 | instructions, atomic instructions and floating-point instructions with binary32 55 | and binary64 formats in IEEE 754-2008. However, some deliverables may assumes 56 | other prerequisites. 57 | 58 | The [Spike RISC-V simulator](https://github.com/riscv/riscv-isa-sim) and/or 59 | QEMU will be used for testing. 60 | 61 | # Deliverables 62 | 63 | ## Basic RISC-V Support [required] 64 | 65 | This is the essential part of the proposal. It contains an implementation of 66 | mandatory `arch/${ISA}` portion of musl for RISC-V (both the 32-bit and 64-bit 67 | variants) and related changes to the toolchain. After that, we can build and 68 | run some programs on Linux on RISC-V with musl. Also, we can use libc-test and 69 | libc-bench tools to measure the quality of the port then. 70 | 71 | Since the code will be derived from the existing port, bug fixes and 72 | implementation of missing part will be the main tasks on this field. 73 | 74 | ## Optimized Implementation of Certain Functions [required] 75 | 76 | musl provides the way to replace generic implementations to ones for a 77 | particular architecture. It is mainly used to provide an optimized version of 78 | mathematical functions, string manipulation functions or so on. For example, 79 | musl provides a custom version of `log2` using the `fyl2x` instruction on x86. 80 | 81 | While practical optimization requires careful measurement with actual hardware, 82 | I do not have any working hardware yet. Thus, the optimization will be limited 83 | to some cases. For example, the use of special instructions or great reduction 84 | of instruction cycles is viable. Optimization will be done not only on musl 85 | itselft, but also on the toolchain if needed. 86 | 87 | ## Support for Processors without Floating-point Instructions [required] 88 | 89 | As I specified in the "Targets" section, I choose RISC-V processors with 90 | floating-point units (FPUs) as the primary targets. 
However, some RISC-V 91 | microcontrollers omit FPUs as permitted by the ISA specification. 92 | 93 | In addition to the primary targets, it is worthwhile to test musl against the 94 | software floating-point (soft-float) ABI and fix errors in order to support 95 | such microcontrollers. 96 | 97 | ## Support for Processors without Atomic Instructions [required] 98 | 99 | As I specified in the "Targets" section, I choose RISC-V processors with atomic 100 | instructions. However, the ISA specification permits processors to omit such 101 | instructions specified in the A standard extension. 102 | 103 | musl requires compare-and-swap (CAS) or other atomic memory operation that can 104 | be used for implementing CAS. Thus, an additional work is required to support 105 | processors without atomic instructions. 106 | 107 | ### Migration of CAS Emulation System Calls to vDSO Functions [optional] 108 | 109 | The [RISC-V fork of Linux kernel](https://github.com/riscv/riscv-linux) 110 | provides CAS emulation system calls for processors without atomic instructions 111 | and glibc calls them if it is built for such processors. It costs CPU time for 112 | interruption and context switching every time CAS is performed. 113 | 114 | Moreover, when a system is ported to processors with atomic instructions, we 115 | need to recompile the library to gain the advantage of atomic instructions. The 116 | situation is getting worse if we use static-linked binaries; we need to rebuild 117 | the whole system to eliminate unnecessary system calls. Also, on non-SMP 118 | systems, CAS can be achieved by much simpler way than what we do for now. 119 | 120 | The virtual dynamic shared object (vDSO) is a mechanism to avoid system calls 121 | in such situation. With vDSO, userland has to call functions exported from 122 | kernel and the functions will choose the best way on the current setup. 123 | Implementing vDSO functions for CAS emulation will benefit not only musl-based 124 | systems but also glibc-based systems if the libc uses them. 125 | 126 | ## Automatic Testing Infrastructure [required] 127 | 128 | Since the software stack of RISC-V is still in heavy development, it is desired 129 | to have an infrastructure to test the port automatically against latest 130 | simulator, toolchain, kernel and so on. Then we will be able to find when 131 | something goes wrong. 132 | 133 | At the beginning, it will be made of bunch of scripts to checkout the sources, 134 | build the tools and rootfs image, run the simulator and save the result into a 135 | file. It will run on my machine for a while, but it would be better to have a 136 | place in either of the projects. 137 | 138 | ## Improvement of musl Porting Guide [required] 139 | 140 | As I mentioned before, musl has a porting document. However, recent changes 141 | made it obsolete, while the majority is still useful now. Also, the document is 142 | made of links to memos, archives of the mailing list, commits, ISA vendors' 143 | official documents and so on. 144 | 145 | It would be better to have an up-to-date and self-contained porting guide 146 | describing topics that is common between architectures on the official wiki. 147 | 148 | ## Preliminary Work toward System V ABI RISC-V Processor Supplement [optional] 149 | 150 | On other architectures, modern Unix systems use the System V ABI Processor 151 | Supplement (psABI) documents for the reference of calling conventions, 152 | interface with operating systems, the value of ELF field headers and so on. 
In 153 | contrast, there is no such documents for RISC-V, though Linux and some \*BSDs 154 | have already been ported to RISC-V. The user-level ISA specification only 155 | specifies some aspect of calling conventions. It will benefit software 156 | developers to have a complete psABI document. 157 | 158 | It is too hard for a twelve-weeks project to discuss and specify whole psABI 159 | while the porting is underway. Thus I will write some documents describing the 160 | current ABI for a starting point toward actual psABI. I will survey and 161 | discuss the ABI with other developers during porting anyway, so the work will 162 | be a summary of my research and the discussions. 163 | 164 | ## Others 165 | 166 | As part of the project, I may contribute other patches to related projects, 167 | e.g. tools provided by the RISC-V project, musl, libc-test, etc. 168 | 169 | # Schedule 170 | 171 | ## Community Bonding Period (April 22 -- May 22) 172 | 173 | - Building tools, kernel and rootfs with glibc, and then executing them. 174 | - Reading musl, trying to fix trivial problems and discussing them. 175 | - Reading patches in 176 | [riscv-gnu-toolchain](https://github.com/riscv/riscv-gnu-toolchain) to 177 | understand RISC-V specific part of the toolchain. 178 | - Reading patches in [musl-cross](https://bitbucket.org/GregorR/musl-cross) to 179 | understand musl specific part of the toolchain. 180 | - Research the existing port. 181 | - Communicating with other developers to understand the details. 182 | 183 | ## Week 1 -- 3 (May 23 -- June 12) 184 | 185 | - Integrate the existing port to the toolchain. 186 | - Implement the missing features for RV64G. 187 | - Fix bugs blocking the integration. 188 | 189 | ## Week 4 (June 13 -- June 19) 190 | 191 | - Build rootfs with musl-based toolchain. 192 | - Fix userland code if it depends on glibc. 193 | - Run Linux with musl-based userland on the simulator. 194 | - Debug the port if the system does not launch. 195 | 196 | ## Week 5 (June 20 -- June 26) 197 | 198 | - Build [libc-test](http://nsz.repo.hu/git/?p=libc-test) and execute it. 199 | - Check the result of the test and compare with x86. 200 | - Submit the patch to the mailing list for reviewing. 201 | - Update the patch in response to the result of reviewing. 202 | - Prepare and submit the mid-term evaluation. 203 | 204 | ## Week 6 -- 7 (June 27 -- July 10) 205 | 206 | - Write test automation scripts. 207 | - Implement the missing features for RV32G. 208 | - Build the toolchain with multilib setting. 209 | - Fix bugs blocking multilib. 210 | 211 | ## Week 8 (July 11 -- July 17) 212 | 213 | - Build [libc-bench](http://git.musl-libc.org/cgit/libc-bench/) and execute it. 214 | - Test the code on RV32IMA and RV64IMA (soft-float) settings. 215 | - Fix bugs for soft-float settings. 216 | 217 | ## Week 9 -- 10 (July 18 -- July 31) 218 | 219 | - Implement atomic operation primitives for `-mno-atomic`. 220 | 221 | - Test the code on `-mno-atomic` settings. 222 | - Submit the patches to the mailing lists for reviewing. 223 | - Update the patches in response to the result of reviewing. 224 | 225 | ## Week 11 -- 12 (August 1 -- August 14) 226 | 227 | - Work on optimization. 228 | - Write the porting guide. 229 | 230 | ## Week 13 -- 14 (August 15 -- August 23) 231 | 232 | - Work on the remaining works. 233 | - Clean up the code. 234 | - Prepare and submit the final evaluation and code samples. 235 | 236 | # Biographical Information 237 | 238 | I am an information science master's student in Japan. 
My field of study is 239 | coding theory. In particular, data compression for the output data of particle 240 | detectors was the main theme of my bachelor thesis. 241 | 242 | I am interested in how operating systems work. I wrote a tiny (less than 3 243 | kSLOC) OS with cooperative multitasking and serial I/O written in C for a H8 244 | microcontroller as a hobby. I choose Windows and Linux for daily use. However, 245 | sometimes I try to install, play with and read implementations of other kinds 246 | of OSs like Hurd, \*BSDs, OpenIndiana and so on. I have no experience in 247 | large-scale OS development, though. 248 | 249 | I have participated in OSS projects for few years. I have contributed 50+ 250 | patches to the [Rust programming language](https://www.rust-lang.org/) and 251 | written part of the [Rust overlay for Gentoo 252 | Linux](https://github.com/gentoo/gentoo-rust). I have contributed some patches 253 | to [compiler-rt](http://compiler-rt.llvm.org/) and other projects too. 254 | -------------------------------------------------------------------------------- /GSoC-2015/Proposed/Mozilla-ManishG-FormSupportServo/Mozilla-ManishG-FormSupportServo.md: -------------------------------------------------------------------------------- 1 | > Mock proposal: 2 | > 3 | > *(Written from the point of view of a student who has done their research, perhaps asking around in IRC and/or sifting through the spec/code. The actual project doesn't need to match this, though hopefully this proposal will be useful to the student who will finally be working on this, if any)* 4 | 5 | ### Personal Details 6 | 7 | - ... 8 | 9 | ### Project Proposal 10 | 11 | > I wish to work on improving form support in Servo. I shall be implementing the following features: 12 | 13 | - [Form validation](https://html.spec.whatwg.org/multipage/forms.html#constraints) (minlength/maxlength/size/required/pattern/min/max) 14 | 15 | - Form controls: 16 | 17 | - select, option, optgroup: These can be implemented as tables, to bring us limited functionality. 18 | 19 | - label: This requires some[ activation hooks](https://html.spec.whatwg.org/multipage/forms.html#the-label-element), which can be added by implementing Activatable similar to[ htmlinputelement.rs](http://mxr.mozilla.org/servo/source/components/script/dom/htmlinputelement.rs#612) 20 | 21 | - Improving existing ones: I could create better-looking widgets for radio buttons, buttons, and checkboxes via CSS 22 | 23 | - datalist: If I get time, I can try to add table-based datalist support too. The UI code might also be useful for providing limited autocompletion support 24 | 25 | - File uploads: 26 | 27 | - Support basic[ ](http://dev.w3.org/2006/webapi/FileAPI/#dfn-Blob)[*Blob*](http://dev.w3.org/2006/webapi/FileAPI/#dfn-Blob) and[ ](http://dev.w3.org/2006/webapi/FileAPI/#dfn-file)[*File*](http://dev.w3.org/2006/webapi/FileAPI/#dfn-file) interfaces that are able to contain data. These can then be passed around internally; there is no need for advanced[ file reading](http://dev.w3.org/2006/webapi/FileAPI/#reading-data-section) . File names and an internal API for accessing the data is enough. 28 | 29 | - If there is time, support[ ](http://dev.w3.org/2006/webapi/FileAPI/#dfn-filereader)[*FileReader*](http://dev.w3.org/2006/webapi/FileAPI/#dfn-filereader). 
30 | 31 | - Add support for proper[ ](http://localhost:8000/multipage/forms.html#multipart/form-data-encoding-algorithm)[multipart/form-data](http://localhost:8000/multipage/forms.html#multipart/form-data-encoding-algorithm) serialization, implementing[ ](http://tools.ietf.org/html/rfc2388)[*RFC 2388*](http://tools.ietf.org/html/rfc2388)*.* I may only focus on UTF8 encoding and expand to other encodings if there is time. 32 | 33 | - Support a basic File upload widget; which looks like a text input. Dialog support is not a part of this project. Improve the submission algorithm in[ ](http://mxr.mozilla.org/servo/source/components/script/dom/htmlformelement.rs)[*htmlformelement.rs*](http://mxr.mozilla.org/servo/source/components/script/dom/htmlformelement.rs) to support this. 34 | 35 | - Improved editing for text controls: 36 | 37 | - Basic keyboard shortcuts: Ctrl-A, Ctrl-V, Ctrl-C. 38 | 39 | - Caret support. Initially this can just be a phantom pipe char in the text field, if there is time we can try to integrate this better into layout. 40 | 41 | - If there is time, support for moving the caret by click 42 | 43 | - If there is time, advanced selection support. 44 | 45 | - *I wonder how this will interact with bidi and the rest. Though that's way out of scope for gsoc.* 46 | 47 | - Tests: Add tests to[ web-platform-tests](https://github.com/w3c/web-platform-tests/) wherever missing 48 | 49 | - *We'll probably need to figure out a way to properly test stuff like activation; this might also be out of scope of gsoc since it can't be done with regular javascript.* 50 | 51 | > *It would probably be nice to get spec-complete form owner support as well in this project. However this isn't something that would make sense in the proposal (too obscure/technical — a student won't be aware of the issue whilst writing the proposal), but I think it would be a fun side-adventure for the student halfway through the project. It involves discussing the spec with Hixie and figuring out how h5e handles the “in a form” state, and then figuring out a way to track form owners efficiently (the spec seems to want us to update form owners on every DOM mutation).* 52 | 53 | ### Schedule of Deliverables 54 | 55 | > The coding period is around 12-13 weeks. Since most tests are already written, most weeks will involve testing code written that week even if not explicitly mentioned. 56 | 57 | - Week 1: 58 | 59 | - Work on reading the[ HTML Forms spec](http://localhost:8000/multipage/forms.html), and comparing it with existing code. 60 | 61 | - Read through[ ](https://mxr.mozilla.org/servo/source/components/script/dom/activation.rs)[*activation.rs*](https://mxr.mozilla.org/servo/source/components/script/dom/activation.rs),[ ](http://mxr.mozilla.org/servo/source/components/script/dom/htmlinputelement.rs)[*htmlinputelement.rs*](http://mxr.mozilla.org/servo/source/components/script/dom/htmlinputelement.rs),[ ](http://mxr.mozilla.org/servo/source/components/script/dom/htmlformelement.rs)[*htmlformelement.rs*](http://mxr.mozilla.org/servo/source/components/script/dom/htmlformelement.rs),[ ](http://mxr.mozilla.org/servo/source/components/script/textinput.rs)[*textinput.rs*](http://mxr.mozilla.org/servo/source/components/script/textinput.rs). 62 | 63 | - Week 2: 64 | 65 | - Start work on form validation. Add the relevant attributes to the webidl for *\*, *\*, and the other widgets. Add a basic[ constraint validation](http://localhost:8000/multipage/forms.html#constraints) method that validates a select few of the constraints. 
Hook it up to[ ](http://localhost:8000/multipage/forms.html#the-form-element:statically-validate-the-constraints)[*checkValidity()*](http://localhost:8000/multipage/forms.html#the-form-element:statically-validate-the-constraints). 66 | 67 | - *I feel that this is a good thing to start with since it gives a nice introduction to introducing new DOM attributes and manipulating them.* 68 | 69 | - Fill in the implementation of [*Blob*](http://dev.w3.org/2006/webapi/FileAPI/#dfn-Blob) and[ ](http://dev.w3.org/2006/webapi/FileAPI/#dfn-file)[*File*](http://dev.w3.org/2006/webapi/FileAPI/#dfn-file) with stubs for the attributes we need. 70 | 71 | - *The constructor needs sequence\<\> webidl support which we don't have right now IIRC, but I think we can add it by gsoc. It's not really necessary for this project either way.* 72 | 73 | - *Not sure about it being easy; the correct implementation uses for..of iteration through a C++ class.* 74 | 75 | - Finish reading the forms spec. 76 | 77 | - Week 3: 78 | 79 | - Add more constraints to the validation algorithm. 80 | 81 | - Add the[ ](http://localhost:8000/multipage/forms.html#dom-cva-validity)[*ValidityState*](http://localhost:8000/multipage/forms.html#dom-cva-validity) interface and support some of the attributes (not necessarily the methods) 82 | 83 | - Add simple[ interactive reporting](http://localhost:8000/multipage/forms.html#interactively-validate-the-constraints) of errors, perhaps by scrolling the user to the element. 84 | 85 | - Read through [*RFC 2388*](http://tools.ietf.org/html/rfc2388) and get a rough idea of how form data serialization is to be done 86 | 87 | - Week 4: 88 | 89 | - Add support for Blob/File to contain data or have a reference to the file. 90 | 91 | - *We are*[ ](http://dev.w3.org/2006/webapi/FileAPI/#file)[**allowed to**](http://dev.w3.org/2006/webapi/FileAPI/#file) *skip snapshotting the file, though we probably should maintain a snapshot depending on what other browsers do* 92 | 93 | - Start looking into how select/option can be styled. 94 | 95 | - Write basic[ ](http://localhost:8000/multipage/forms.html#htmlselectelement)[*\*](http://localhost:8000/multipage/forms.html#htmlselectelement)/[*\*](http://localhost:8000/multipage/forms.html#htmloptionelement) webidls, copying/sharing most of the method implementations from \ wherever possible. 96 | 97 | - Work more on validity if there is time 98 | 99 | - Week 5: 100 | 101 | - Write a proper[ ](http://localhost:8000/multipage/forms.html#multipart/form-data-encoding-algorithm)[*multipart/form-data*](http://localhost:8000/multipage/forms.html#multipart/form-data-encoding-algorithm) serialization algorithm. 102 | 103 | - Add support for[ ](http://localhost:8000/multipage/forms.html#htmllabelelement)[*\*](http://localhost:8000/multipage/forms.html#htmllabelelement)*,* along with the activation hooks. 104 | 105 | - If there is time, try adding a table-based widget for [*\*](http://localhost:8000/multipage/forms.html#htmlselectelement)/[*\*](http://localhost:8000/multipage/forms.html#htmloptionelement)[](http://localhost:8000/multipage/forms.html#htmloptionelement) 106 | 107 | - Week 6: 108 | 109 | - Finish widgets for [*\*](http://localhost:8000/multipage/forms.html#htmlselectelement)/[*\*](http://localhost:8000/multipage/forms.html#htmloptionelement)[](http://localhost:8000/multipage/forms.html#htmloptionelement) 110 | 111 | - *Basically, layout will be asked to treat these as tables of a certain kind. 
The student need not add dropdown support, a simple table showing all the options is enough (preferably with a scrollbar, if that's doable in CSS).* 112 | 113 | - Create some better CSS for existing widgets, and start using them. 114 | 115 | - *The challenge here is in writing these in CSS that we support, and still having them look decent. Hopefully we'll have better CSS support by this time anyway; though looking at the styled buttons on GitHub in Servo we already support enough CSS for this.* 116 | 117 | - Week 7: 118 | 119 | - Implement the form submission algorithm for[ ](http://localhost:8000/multipage/forms.html#multipart/form-data-encoding-algorithm)[*multipart/form-data*](http://localhost:8000/multipage/forms.html#multipart/form-data-encoding-algorithm) . Hook it up to regular form submission. 120 | 121 | - *The web isn't supposed to be able to mess with form uploads so testing will be hard. Since we're just using a text input, we can temporarily write content tests for this until we figure out how to properly test things that are hidden to Javascript.* 122 | 123 | - *As*ide: *if we* are *exposing form uploads to javascript, perhaps we should be marking this as a testing-only feature or something so that this doesn't compromise the security of dogfooders and users of the “mobile Wikipedia browser” if we get to making that happen.* 124 | 125 | - Start looking into how to add caret support 126 | 127 | - Week 8: 128 | 129 | - Finish up remaining bits of validation, if any 130 | 131 | - Learn how to hook into the clipboard for copy/paste functionality 132 | 133 | - Polish up existing selection support if necessary (internals, not UI) 134 | 135 | - Add support for Ctrl-A 136 | 137 | - Week 9: 138 | 139 | - Add support for Ctrl-C, Ctrl-V 140 | 141 | - *Might be hard to hook into the clipboard cross-platform, so this might take longer* 142 | 143 | - Add caret support as a simple rendered pipe, or as something more sophisticated if possible in the timeframe. 144 | 145 | - Week 10: 146 | 147 | - Look into adding the ability to set caret position based on click 148 | 149 | - Look into adding styling for \ 150 | 151 | - Fill in more parts of the File spec 152 | 153 | - Week 11: 154 | 155 | - Look into support for visible selections 156 | 157 | - Work on fixing any web-platform-test failures related to my code 158 | 159 | - Week 12: 160 | 161 | - Polish up and document remaining code 162 | 163 | - Continue to work on tests 164 | 165 | ### Open Source Development Experience 166 | 167 | > None 168 | 169 | ### Work/Internship Experience 170 | 171 | > None 172 | 173 | ### Academic Experience 174 | 175 | > I have a bachelor’s degree in multiplication by 3. 176 | 177 | ### Why Me 178 | 179 | > I am awesome. 180 | 181 | ### Why Mozilla 182 | 183 | > Cake. 
184 | > 185 | > Random notes: 186 | 187 | - Chromium widget theming:[ ](https://code.google.com/p/chromium/codesearch#chromium/src/third_party/WebKit/Source/core/rendering/RenderThemeChromiumMac.mm)[*https://code.google.com/p/chromium/codesearch\#chromium/src/third\_party/WebKit/Source/core/rendering/RenderThemeChromiumMac.mm*](https://code.google.com/p/chromium/codesearch#chromium/src/third_party/WebKit/Source/core/rendering/RenderThemeChromiumMac.mm) 188 | 189 | - mrobinson says that OS-specific theming is a nightmare 190 | 191 | 192 | -------------------------------------------------------------------------------- /GSoC-2015/Accepted/SaketC-statsmodels-MixedModels/SaketC-statsmodels-MixedModels.md: -------------------------------------------------------------------------------- 1 | **statsmodels - Python Software Foundation Google Summer of Code 2015** 2 | 3 | **Improvements to Mixed Effects Models** 4 | 5 | 6 | 7 | 8 | 9 | ## **Sub-organization:** statsmodels 10 | 11 | ## Student Information 12 | 13 | * **Name**: Saket Choudhary 14 | 15 | * **Email**: [saketkc@gmail.com](mailto:saketkc@gmail.com) 16 | 17 | * **Telephone**: +1-213-477-3770 18 | 19 | * **Time zone**: GMT -0800 Pacific Time Zone 20 | 21 | * **IRC handle**: [saketkc@irc.freenode.net](mailto:saketkc@ircfreenode.net) 22 | 23 | * **Source control usernames**: 24 | 25 | * **Github**: [https://github.com/saketkc/](https://github.com/saketkc/) 26 | 27 | * **Bitbucket**: [https://bitbucket.org/saketkc](https://bitbucket.org/saketkc) 28 | 29 | * **IM**: [saketkc@gmail.com](mailto:saketkc@gmail.com) 30 | 31 | * **Twitter**: [https://twitter.com/saketkc](https://twitter.com/saketkc) 32 | 33 | * **Home Page**: [http://saket-choudhary.me](http://saket-choudhary.me) 34 | 35 | * **Blog**: 36 | 37 | * [http://statsmodels-mlm-gsoc2015.blogspot.com/](http://statsmodels-mlm-gsoc2015.blogspot.com/) 38 | 39 | * **GSoC Blog RSS feed**: 40 | 41 | * [http://statsmodels-mlm-gsoc2015.blogspot.com/feeds/posts/default?alt=rss](http://statsmodels-mlm-gsoc2015.blogspot.com/feeds/posts/default?alt=rss) 42 | 43 | * **Other personal information you think we would find relevant**: 44 | 45 | * I was part of GSoC’12 and worked on improving the Slideshow publishing API for OERPub with [Connexions](http://cnx.org/) a Rice University project: [[Proposal](http://www.google-melange.com/gsoc/proposal/public/google/gsoc2012/saketkc/5662278724616192)] 46 | 47 | * As part of GSoC’13, I worked for [Galaxy Project](https://wiki.galaxyproject.org/FrontPage), working on python codebase for implementing changes to the workflow system. A major part of my code didn’t make it to the codebase, I however still contribute to the project. We had a [preprint](http://biorxiv.org/content/early/2014/10/19/010538) submitted for part of our work too. [[Proposal](https://docs.google.com/document/d/1CT7cEArKg2VOY3EPFcNiMNjzMxyKZ1WmgOSMb_wnLPI/edit)] 48 | 49 | * I also participated in GSoC’14 with [BioJS](http://biojs.net/) and implemented a [Human Genetic Variation Viewer](http://biojs.io/d/biojs-vis-hgv) , a manuscript is under submission. 
[[Proposal](https://docs.google.com/document/d/1CT7cEArKg2VOY3EPFcNiMNjzMxyKZ1WmgOSMb_wnLPI/edit?usp=sharing)] 50 | 51 | * I also contribute occasionally to scipy, pgmpy, homebrew-science 52 | 53 | ## University Information 54 | 55 | * **University**: University Of Southern California 56 | 57 | * **Major**: Computational Biology & Bioinformatics 58 | 59 | * **Current Year**: 1st 60 | 61 | * Expected Graduation date: 2019 62 | 63 | * Degree: Ph.D. 64 | 65 | 66 | 67 | ## Project Proposal Information 68 | 69 | * **Proposal Title**: Improvements to Mixed Effects Model 70 | 71 | * **Proposal Abstract**: 72 | 73 | * The statsmodels.regression.mixed_linear module is relatively new. Mixed effects models are often used to model experiments where a study has been repeated on an individual multiple times, in order to incorporate both fixed-effect parameters and unobserved random effects. Applications of LMMs include modeling longitudinal data, for example modeling response times for individuals under the influence of different types of drugs. (A minimal usage sketch of the existing interface appears at the end of this proposal.) 74 | 75 | 76 | 77 | * From a user's point of view, the main features that have been lacking include: 78 | 79 | * **Variance Component models**: Though distinct random effects can be constrained to be independent, the variances are not constrained to be equal 80 | 81 | * **Heteroscedastic Residual errors**: Related issue (tangentially): [https://github.com/statsmodels/statsmodels/issues/1948](https://github.com/statsmodels/statsmodels/issues/1948) 82 | 83 | * **Crossed Random Effects/Nested Random Effects**: The current mixed_linear module can only model random effects arising from a single factor. Cross-classified data, where several factors are expected to have random effects, thus cannot be modeled. Examples of such studies include gene expression studies where a set of genes from different individuals is put under certain categories of stress and their expression levels are measured. Patients constitute a random sample of the population, and so do the stress levels, which makes a crossed design natural for such studies. Nested random effects can then be taken care of implicitly, where each ‘sample’ might be ‘nested’ within other covariates. This should also implicitly support uncorrelated random effects. Related issue: [https://github.com/statsmodels/statsmodels/issues/1946](https://github.com/statsmodels/statsmodels/issues/1946) 84 | 85 | 86 | 87 | A few discussions requesting these features are at: [https://github.com/statsmodels/statsmodels/issues/646](https://github.com/statsmodels/statsmodels/issues/646) 88 | 89 | and [https://groups.google.com/forum/#!topic/pystatsmodels/CrHCZkIWj4w](https://groups.google.com/forum/#!topic/pystatsmodels/CrHCZkIWj4w) 90 | 91 | * **Deliverables/End Goals & Challenges** 92 | 93 | * **Support for crossed random effects:** This would probably be the most challenging part. The plan is to implement a non-cholmod based way to compute the Cholesky decomposition of the error covariance matrix D. Two known approaches are documented in references [1] and [2]. This will also require significant benchmarking against lme4 methods. The general methodology is detailed in [12]. 94 | 95 | * **Support for variance component models, heteroscedastic residuals**: nlme supports heteroscedasticity[6]. The implementation details are in [3]. This will also possibly require using sparse methods and hence would depend on the above. A non-sparse (and slower) implementation is, however, independent of the above.
96 | 97 | * **Support for other MLE and REML estimation methods**: The `lmm` package in R[11] implements several methods for rapid MLE and REML convergence. Methods that could likely be ported to statsmodels include `fastml`, `fastmode`, and `fastrml`. 98 | 99 | * **Support for nonlinear mixed effect models**: This would be an optional goal, given that nonlinear mixed models have rather specific use cases[4]. 100 | 101 | * **Proposal Detailed Description/Timeline**: 102 |
| Week | Tasks |
| --- | --- |
| Community Bonding Period<br>27 April - 24 May | -- Chalk out all possible non-cholmod ways of handling sparse cases. This would require understanding the discussions in [1] and [2], taking mentors' and the statsmodels community's input. |
| Week 1<br>May 25 - May 31 | -- Support for heteroscedastic residual errors<br>-- Documentation and unit tests |
| Week 2 and Week 3<br>June 1 - June 7,<br>June 8 - June 14 | -- Implement a generic class for performing non-cholmod based Cholesky decompositions<br>-- Documentation and unit tests |
| Week 4, Week 5<br>June 15 - June 21,<br>June 22 - June 28 | -- Implementation of crossed random effects support using the non-cholmod support<br>-- Tests and documentation |
| Week 6<br>June 29 - July 5 | -- Implement support for variance components models for constrained random effects<br>-- Tests and documentation |
| Midterm Deliverables | -- A generic non-cholmod approach for sparse calculations<br>-- Support for crossed random effects<br>-- Support for specifying variance component models |
| Week 7<br>July 6 - July 12 | -- Port `fastml` methods from `lmm`<br>-- Tests and documentation |
| Week 8<br>July 13 - July 19 | -- Port `fastml` methods from `lmm`<br>-- Tests and documentation |
| Week 9<br>July 20 - July 26 | -- Port `fastmode` from the `lmm` R package<br>-- Tests and documentation |
| Week 10 and Week 11<br>July 27 - August 2 | -- Support for nonlinear models<br>-- Tests and documentation |
| Week 12<br>August 3 - August 9 | -- Code profiling<br>-- IPython Notebook demos<br>-- Tests and documentation |
| Week 13<br>August 10 - August 17,<br>August 18 - August 24 | -- Code profiling<br>-- IPython Notebook demos<br>-- Tests and documentation |
| Term end Deliverables | -- Support for non-linear mixed effects<br>-- IPython notebooks with examples benchmarking/comparing methods against methods from `lme4`, `lmm` and `nlme` (wherever applicable) |
243 | 244 | 245 | I have deliberately kept the last two weeks as buffer periods in order to accommodate any pending/overdue tasks from previous weeks. 246 | 247 | **References**: 248 | 249 | 1. Parameter Estimation in High Dimensional Gaussian Distributions: [http://www.math.ntnu.no/preprint/statistics/2012/S5-2012.pdf](http://www.math.ntnu.no/preprint/statistics/2012/S5-2012.pdf) 250 | 251 | 2. FaST linear mixed models for genome-wide association studies (see supplement): [http://www.nature.com/nmeth/journal/v8/n10/abs/nmeth.1681.html](http://www.nature.com/nmeth/journal/v8/n10/abs/nmeth.1681.html) 252 | 253 | 3. Pinheiro, J.C. and Bates, D.M. (2000). *Mixed-Effects Models in S and S-Plus*. Springer 254 | 255 | 4. Nonlinear Mixed Effects Models: [http://www4.stat.ncsu.edu/~davidian/nlmmtalk.pdf](http://www4.stat.ncsu.edu/~davidian/nlmmtalk.pdf) 256 | 257 | 5. Fast algorithms for ML and REML estimation in linear models: [http://raptor1.bizlab.mtsu.edu/s-drive/TEFF/Rlib/library/lmm/doc/improve.pdf](http://raptor1.bizlab.mtsu.edu/s-drive/TEFF/Rlib/library/lmm/doc/improve.pdf) 258 | 259 | 6. lme4 heteroscedasticity: [https://stat.ethz.ch/pipermail/r-sig-mixed-models/2014q4/022753.html](https://stat.ethz.ch/pipermail/r-sig-mixed-models/2014q4/022753.html) 260 | 261 | 7. Current status of Mixed Linear models in statsmodels: [http://statsmodels.sourceforge.net/devel/mixed_linear.html](http://statsmodels.sourceforge.net/devel/mixed_linear.html) 262 | 263 | 8. lme4 book: [http://lme4.r-forge.r-project.org/book/](http://lme4.r-forge.r-project.org/book/) 264 | 265 | 9. lme4 implementation: [http://econ.ucsb.edu/~doug/245a/Papers/Mixed%20Effects%20Implement.pdf](http://econ.ucsb.edu/~doug/245a/Papers/Mixed%20Effects%20Implement.pdf) and MJ Lindstrom, DM Bates (1988). "Newton Raphson and EM algorithms for linear mixed effects models for repeated measures data" 266 | 267 | 10. Julia implementation: [https://github.com/dmbates/MixedModels.jl](https://github.com/dmbates/MixedModels.jl) 268 | 269 | 11. lmm in R: [http://cran.r-project.org/web/packages/lmm/lmm.pdf](http://cran.r-project.org/web/packages/lmm/lmm.pdf) 270 | 271 | 12. Mixed-effects modeling with crossed random effects for subjects and items: [http://www.sciencedirect.com/science/article/pii/S0749596X07001398](http://www.sciencedirect.com/science/article/pii/S0749596X07001398) 272 | 273 | * **Patches contributed to statsmodels**: 274 | 275 | * On Hold 276 | 277 | * Fix trendorder for trend-only models in VAR: [https://github.com/statsmodels/statsmodels/pull/2327](https://github.com/statsmodels/statsmodels/pull/2327) 278 | 279 | * Doc fix for hazard_regression: https://github.com/statsmodels/statsmodels/pull/2236 280 | 281 | * Accepted and Merged 282 | 283 | * Check internet availability before running tests: [https://github.com/statsmodels/statsmodels/pull/2247](https://github.com/statsmodels/statsmodels/pull/2247) 284 | 285 | * Raise exception on incorrect trend type: [https://github.com/statsmodels/statsmodels/pull/2329](https://github.com/statsmodels/statsmodels/pull/2329) 286 | 287 | * Doc fixes for MixedLinear: https://github.com/statsmodels/statsmodels/pull/2333 288 | 289 | ## Other Schedule Information 290 | 291 | I will probably be taking a course during summer 2015. Besides this, I do not have any other commitments during the coding period.
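As referenced in the abstract, here is a minimal usage sketch of the single-grouping-factor MixedLM interface that statsmodels already exposes, which the crossed-random-effects, variance-component and heteroscedastic-residual work above would extend. The data file and column names are hypothetical, and the exact module path and arguments may differ between statsmodels releases.

```python
# Minimal sketch: fit a random-intercept-and-slope model with one
# grouping factor, the case the current implementation supports.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.read_csv("reaction_times.csv")   # hypothetical longitudinal data

model = smf.mixedlm("reaction ~ days", data,
                    groups=data["subject"],   # single random-effects factor
                    re_formula="~days")       # random intercept and slope
result = model.fit(reml=True)
print(result.summary())
```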
292 | 293 | 294 | 295 | -------------------------------------------------------------------------------- /GSoC-2016/Accepted/PSF-Kivy-udiboy1209-Tiled-Integration-With-KivEnt/PSF-Kivy-udiboy1209-Tiled-Integration-With-KivEnt.md: -------------------------------------------------------------------------------- 1 | Python Software Foundation 2016 2 | ================================ 3 | 4 | Google Summer of Code Application 5 | -------------------------------- 6 | 7 | ### Sub-Organization: 8 | 9 | Kivy 10 | 11 | ### Mentors: 12 | 13 | Jacob Kovak, Gabriel Pettier 14 | 15 | ### Personal Details: 16 | 17 | - **Name:** Meet Udeshi 18 | - **IRC:** [*udiboy1209@irc.freenode.net*](mailto:udiboy1209@irc.freenode.net) 19 | - **Github:** [*https://github.com/udiboy1209*](https://github.com/udiboy1209) 20 | - **Email:** [*mudeshi1209@gmail.com*](mailto:mudeshi1209@gmail.com) 21 | - **Telephone:** +91-9619221240 22 | - **Time Zone:** Mumbai, IST, UTC+0530 23 | - **GSoC Blog:** [*https://udiboy1209.github.io/blog*](https://udiboy1209.github.io/blog) 24 | 25 | ### Code Contributions: 26 | 27 | I have been actively contributing to open source projects in Python, JavaScript, and Android. I have also been contributing to Kivy for a while now. Here is a list of some of my Pull Requests: 28 | 29 | To kivy/kivent: 30 | 31 | - Pull Request \#124: Implement gameview camera rotation 32 | - Pull Request \#121: Support passing moment for a body as an init argument 33 | 34 | To kivy/kivy: 35 | 36 | - Pull Request \#4055: Implement wrapping of continuous long text in TextInput 37 | - Pull Request \#4024: Always show cursor at the moment of touch 38 | - Pull Request \#4009: hint\_text in TextInput shows when focused and no text entered 39 | - Pull Request \#3963: Show disabled\_color when disabled=True for markup label 40 | - Pull Request \#3914: Implement underline and strikethrough styling for Label and MarkupLabel 41 | - Pull Request \#3698: Fix audio example not including sound files on Android 42 | 43 | You can see a complete list [*here*](https://github.com/search?utf8=%E2%9C%93&q=author%3Audiboy1209++repo%3Akivy%2Fkivy+repo%3Akivy%2Fkivent+repo%3Akivy%2Fbuildozer&type=Issues&ref=searchresults). 44 | 45 | I also maintain a few of my own open source projects. You can find a list on my website here: [*https://udiboy1209.github.io/projects*](https://udiboy1209.github.io/projects) 46 | 47 | Project Abstract: 48 | ---------------- 49 | 50 | - **Title:** Tiled Integration with KivEnt 51 | 52 | - **Description:** Tiled is a general-purpose tile map editor. It supports various tile shapes like square, hexagonal, and isometric square. Tiled support in KivEnt will be a very useful tool, given that Tiled is already a well-known and feature-rich platform for creating game worlds and maps. A game developer would rather use Tiled to create worlds for Kivy games than some custom tool built into KivEnt. This will also make it easier (at least for the map creation part) for people to port their existing games to KivEnt from other platforms which use Tiled. The aim of this project is to create a fully functional Tiled module which supports almost all of the features Tiled currently offers, i.e. various types of tile shapes, tile animations, multiple layers, shape definitions in tiles, custom data, etc. The project will also require a TMX file format loader/parser.
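To make the TMX loader/parser task concrete, here is a minimal, illustrative sketch of reading the basic structure of a TMX map with Python's standard XML tooling. It assumes CSV-encoded layer data only; base64/zlib/gzip data, per-tile `<frame>` animations, object groups and image layers, which the real parser must also handle, are omitted, and the function name is hypothetical.

```python
# Minimal, illustrative TMX reader (CSV-encoded layers only).
import xml.etree.ElementTree as ET

def load_tmx(path):
    root = ET.parse(path).getroot()            # the <map> element
    tmx = {
        'width': int(root.get('width')),        # map size in tiles
        'height': int(root.get('height')),
        'tilewidth': int(root.get('tilewidth')),
        'tileheight': int(root.get('tileheight')),
        'tilesets': [],
        'layers': [],
    }
    for ts in root.findall('tileset'):
        image = ts.find('image')
        tmx['tilesets'].append({
            'firstgid': int(ts.get('firstgid')),
            'name': ts.get('name'),
            'image': image.get('source') if image is not None else None,
        })
    for layer in root.findall('layer'):
        data = layer.find('data')
        if data.get('encoding') != 'csv':
            raise NotImplementedError('only CSV layers in this sketch')
        gids = [int(g) for g in data.text.replace('\n', '').split(',') if g]
        tmx['layers'].append({'name': layer.get('name'), 'gids': gids})
    return tmx
```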
53 | 54 | Timeline and List of Deliverables: 55 | -------------------------------- 56 | 57 | **Community Bonding Period: Work on Animation Module** 58 | 59 | Here, I plan to work on and complete the animation module required later for the Tiled module. This should help me get a concrete grip on how the internals of KivEnt (i.e. models, components and game systems) function together. In addition to that, I will study [*this PR*](https://github.com/kivy/kivent/pull/117), where an attempt towards integrating Tiled has been started. I will contact the author, and also run the code myself to see if any of it can be used for my project. Also, I hope this period will further help me bond with the Kivy community and its members! 60 | 61 | **23rd May - 5th June: TileSystem Boilerplate code** 62 | 63 | To start with, I will write boilerplate for a TileSystem module which renders tiles for a map (single-layer implementation). This will be the basis of the map integration module. I will code keeping in mind the features which are further planned, so that adding them later will be easier. Things planned in these two weeks are: 64 | 65 | 1. Starting with putting the TileSystem from the [*13\_tilemap example*](https://github.com/kivy/kivent/blob/2.2-dev/examples/13_tilemap/main.py) in a separate module (with a few changes like using model pointers for adding tiles instead of texture ids). Initialise tiles and textures here. 66 | 67 | 2. Creating a template data-holder class for tiles and tile textures. The tile texture template would hold data for the model to be loaded for the tile. The tile template will hold data relevant for the entity, and also a TileTexture field. 68 | 69 | 3. Adding API methods for fetching/adding/removing tiles and tile instances in TileSystem. The fetching method will return the tile entity under a coordinate point. 70 | 71 | 4. Writing a test example to generate a single-layer map (to help in debugging and the final demo). This should also test performance, using the time taken by the update method as the performance metric. 72 | 73 | 74 | 75 | **6th June - 12th June: Animated tiles (Animation System integration)** 76 | 77 | I will integrate the animation system, which I plan to finish in the community bonding period, with this project. This is required to support the tile animations feature in Tiled. Breaking down into discrete milestones: 78 | 79 | 1. Create a field in the TileTexture template to hold the animation frame list. The data stored will be a list of (frame, time) pairs, with frame being a pointer to the model to be displayed, and time being the time period for that frame. 80 | 81 | 2. Add the animation system as a renderer for those tile entities. I will ensure that the data stored in TileTexture is compliant with the format the animation renderer works with. 82 | 83 | 3. Create another test example to demonstrate this. Test the performance of this feature using the above metric. 84 | 85 | 86 | 87 | **13th June - 26th June: TMX Parser module** 88 | 89 | *(Points [a] and [b] are hoped to be finished when midterm evaluations start)* 90 | 91 | I will implement a TMX parser, which will automate the task of loading tiles into the tile template and adding tile sprites to the TileSystem by integrating with the templates defined above. Features to be implemented are: 92 | 93 | 1. Load all tile textures specified in the TMX <tileset> into the TileTexture template. 94 | 95 | 2.
Parse information about each tile to add to the tile entity template and to the TileSystem as a renderer, with a pointer to a TileTexture. Support the encodings TMX uses, which are base64+zlib, base64+gzip, plain base64, and CSV. 96 | 97 | 3. Parse <frame> tags in the tile textures and map that data to the KivEnt-supported animation format required by the animation system in TileTexture. 98 | 99 | 4. Create unit-test TMX files to check each component of the parser separately. Do a basic time and memory performance test for this parser, and improve some code if necessary. 100 | 101 | 102 | 103 | **27th June - 10th July: Multilayer Support** 104 | 105 | Implement a method for multilayer support in the module. 106 | 107 | 1. Create a LayerSystem which handles rendering all properties related to a layer in TMX correctly, like position and opacity. Each layer will have its own renderer. 108 | 109 | 2. Store the TileTexture for each layer on a single tile position in the tile entity. Add the relevant LayerSystems as renderers for this tile. 110 | 111 | 3. Add support to the TMX parser. The parser should be able to merge the tiles on multiple layers but at the same position into a single tile entity. Create unit-test TMX files. 112 | 113 | 4. Test using a multi-layer TMX file. Check performance for this system, and for the TMX parser with this feature, to see if improvements are necessary. 114 | 115 | 116 | 117 | **11th July - 24th July: Isometric and Hexagonal tiles support** 118 | 119 | Extend the above defined classes to support other map types like isometric and hexagonal. Most changes will be in the touch interactions of the TileSystem and the conversion of world coordinates to map coordinates (a rough sketch of the grid-to-world conversion appears after the miscellaneous-features list below). Positioning and rendering, along with the layers implementation, stay the same. The layer merging part of the parser will differ for isometric. Details: 120 | 121 | 1. Create two subclasses of TileSystem which override the methods relevant to converting a tile's position in the tile grid to its position in the world, i.e. those responsible for loading the position component. This, I believe, is all that both tile types would require for rendering, except for some other minor changes. 122 | 123 | 2. Loading the textures from the tileset image for both tile types will require different logic, because of the difference in geometry. Thus, the TMX parser will require extra methods to load individual tiles in the tileset from the tileset image. 124 | 125 | 126 | 127 | **25th July - 7th August: Miscellaneous Features of Tiled** 128 | 129 | Tiled support will be nearly complete with this task. I will add support for miscellaneous features of Tiled such as shape drawing in <objectgroup> and in tile textures, and adding <image>s and <imagelayer>s. Some of these features will only be a part of the TMX parser, because they have no relevance to a tile system. A list of these features is: 130 | 131 | 1. <objectgroup> and <object>. These allow you to define arbitrary shapes to be placed on your map, not necessarily aligned with the grid or contained within a single Tile. Types will include <ellipse>, <polygon> and <polyline>. 132 | 133 | 2. <imagelayer> and <image>. These work similarly to <objectgroup> when it comes to the positioning of objects. Loading textures for these kinds of tags will have to be handled at init time. 134 | 135 | 3. Support custom properties for every object, which are defined in the <property> tag. Create callbacks to use this data after it's been loaded, but before any other rendering takes place.
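Returning to the isometric and hexagonal section above: the grid-to-world conversion that those TileSystem subclasses would override boils down to a couple of small formulas. The sketch below shows the orthogonal and diamond-isometric cases with hypothetical function names; hexagonal and staggered variants differ in detail and are omitted here.

```python
# Rough illustration of the grid -> world conversions the TileSystem
# subclasses would need; tw/th are the tile width/height in pixels.

def ortho_to_world(col, row, tw, th):
    """Orthogonal (square) map: tiles sit on a regular grid."""
    return col * tw, row * th

def iso_to_world(col, row, tw, th):
    """Diamond isometric map: the grid is rotated 45 degrees and
    squashed, so both grid axes contribute to both x and y."""
    x = (col - row) * tw / 2.0
    y = (col + row) * th / 2.0
    return x, y
```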
136 | 137 | 138 | 139 | **7th August - 15th August: Ensure Completion of previous Deliverables** 140 | 141 | I am keeping one week as a buffer period to account for delays in the previous weeks' tasks. I intend to use this period not only to finish off any pending work but also to improve on code finished hastily to keep up with the timeline (using the unit tests to check performance and make improvements). If there are no such delays, I will start with the next task of documentation. 142 | 143 | **16th August - 23rd August: Documentation, and an Example App** 144 | 145 | I am keeping this week solely for the purpose of putting together the individual comments for each method/class as documentation for this module. I plan on using Sphinx for compiling this documentation, and so I will write the comments and docstrings while coding each task in the format it supports. Even though I promise to document each week's code in a blog post as GSoC requires, and also to put in appropriate comments and docstrings as and when I create them, I believe they will turn out much better if I focus on improving them separately in this week. 146 | 147 | I will also write an example app in this week to display this module's features and capabilities. 148 | 149 | Motivation for GSoC: 150 | -------------------- 151 | 152 | I've always felt the need for a multi-platform system which would make it easier to develop for all platforms, while maintaining uniformity in the application. Kivy meets those expectations brilliantly. I specifically chose to work on the game engine KivEnt because I have been interested in game development since I started coding. I had also created a game of my own (with animated sprites, collisions, physics and everything) when I didn't know something like game engines even existed! After that, I have tried a few other game engines for small projects. I especially like KivEnt because it is open source, written in Python (the language I am most comfortable with) and also because it can be used to deploy games to multiple platforms using Kivy's capabilities. I hope to continue contributing to KivEnt and other Kivy projects after GSoC. 153 | 154 | Other Commitments: 155 | ------------------ 156 | 157 | My next academic semester starts in the last week of July. I will not be able to work full time, except on weekends, in the last 4 weeks (~25th July to 23rd August). I have planned my timeline accordingly, spreading out the work to be done in these weeks over a larger time interval. 158 | 159 | - **Extra Details:** 160 | - **University:** IIT Bombay 161 | - **Major:** Electrical Engg.
162 | - **Year:** 2nd, expected to graduate in 2019 163 | - **Degree:** B.Tech + M.Tech Dual 164 | - **Alternate Contacts:** 165 | - **Facebook:** [*facebook.com/udiboy1209*](https://facebook.com/udiboy1209) 166 | - **Hangouts:** [*plus.google.com/u/0/108170871312475118507*](https://plus.google.com/u/0/108170871312475118507) 167 | -------------------------------------------------------------------------------- /GSoC-2015/Accepted/HimanshuMishra-PSF-Implementing-Add-On-System-For-NetworkX/GSoC-2015-Application-Himanshu-Mishra-Add-on-System-for-NetworkX.md: -------------------------------------------------------------------------------- 1 | ## Table of Contents 2 | 3 | * [Basic Information](#basic-information) 4 | * [Personal Background](#personal-background) 5 | * [The Project](#the-project) 6 | * [Prototype](#prototype) 7 | * [Timeline (Tentative)](#timelinetentative) 8 | * [Notes](#notes) 9 | * [References](#references) 10 | 11 | --- 12 | # About Me 13 | 14 | ## Basic Information 15 | 16 | **Name:** Himanshu Mishra
17 | **University:** [Indian Institute of Technology, Kharagpur](http://en.wikipedia.org/wiki/Indian_Institute_of_Technology_Kharagpur)
18 | **Major:** Mathematics and Computing
19 | **Email:** himanshu2014iit@gmail.com
20 | **IRC:** orkohunter
21 | **Github:** [OrkoHunter](https://www.github.com/orkohunter)
22 | **Blog:** http://blog.himanshumishra.in
23 | **Blog RSS Feed:** http://blog.himanshumishra.in/feeds/posts/default/-/networkx
24 | **Timezone:** IST (UTC +5:30)
25 | 26 | ## Background work and Programming Skills 27 | 28 | I am a first-year student of IIT Kharagpur, India. I'm pursuing a degree in Mathematics and Computing. I work on Linux Mint 17 LTS. I use Sublime Text for development and vim for SSH sessions. I am proficient in C and Python. 29 | 30 | Python easily lets me convert my ideas into code. I like it mainly because it is an interpreted language which gives you the freedom to do so many things dynamically. Prototyping anything in Python is very easy and requires less manual work than most other programming languages. I like IPython a lot. I have had a great experience reading tutorials written as IPython notebooks, especially the ones for numpy. 31 | 32 | Since I started programming as a web developer, I used Django all the time, which made me familiar with Python in the beginning. 33 | 34 | I like `itertools` because of its magical functions which are written to handle iteration-based algorithms. They are fast and use memory efficiently. 35 | 36 | I know how to use Git and GitHub. If I am stuck, I go to Google and always come back with a solution. 37 | 38 | # The Project 39 | ## The Problem and Motivation 40 | 41 | NetworkX, being a pure Python library, is easy for people to pick up: Python doesn't need a compiler, so everything runs directly on the Python interpreter. That said, there also exist other graph packages written in other programming languages. Being written in compiled languages, they offer rich functionality whose reimplementation in Python would be a significant task. Some functionality common to NetworkX and these packages performs better in them due to their use of compiled languages. This motivates us strongly to integrate them with NetworkX and make it more awesome. 42 | 43 | Related to the add-ons, in [#1325](https://www.github.com/networkx/networkx/issues/1325) it was suggested to move the drawing package out of the NetworkX core package. The thought is that graph drawing is more interesting to developers who care about visual elements and external graphics packages rather than algorithms and data structures. Keeping it under the NetworkX umbrella as `networkx-matplotlib` would make it easier to develop and maintain. 44 | 45 | ## The Plan 46 | 47 | There is no single best way to do this integration. There are mixed ideas discussed in [#1167](https://github.com/networkx/networkx/issues/1167) about choosing the way of implementation. 48 | To sum up, there are mainly two ways to do this. 49 | 50 | 1. Handling every add-on package in `networkx.external`. All of their necessary modules would reside in the respective sub-directory of the package in `networkx.external`. Then writing code in NetworkX that interfaces between NetworkX objects and the bare lists/arrays expected by the add-on package. Decorators and namespaces can be handled in `networkx.utils`. And lastly, changing `setup.py` to handle installation-related issues, the way Cython works. When Cython is not available, it would skip the installation part of the add-on and everything would work as if there is no add-on system for NetworkX. 51 | But this will lead to complications, as all the code in NetworkX for the add-on packages and their modules would still remain there, unused and counterproductive. That's why we should go for the other way. 52 | 53 | 2. Having add-ons as "optional components" under the NetworkX umbrella.
First of all, NetworkX proper should not take it for granted that these add-ons will reliably exist. If it did, they should instead become an integral part of NetworkX, like NumPy and SciPy. Further, adding more add-ons should be as smooth as appending to a list of add-ons. So, the idea is that `setup.py` should inspect and load configurations from `networkx.external.addons` and let the add-ons decide whether they can be installed or not. 54 | 55 | My work can be divided into three phases. 56 | 57 | #### Phase 1: Designing the approach for add-on implementation 58 | 59 | A separate repository under the NetworkX umbrella on GitHub will be hosted as `networkx-`, containing the add-on source files, tests and code files written for Cython. The Python wrapper, however, will be present in the NetworkX proper package. For example, a module named `partition` will be created in `networkx.addons.` to provide the partitioning functionality for NetworkX. 60 | 61 | Regarding `setup.py`, it will not be edited for each and every add-on, to avoid quickly inflating the file. Rather, it will inspect `networkx.external.addons`, so the add-ons which are present will be installed, leaving no conflicts with those which are not present but are mentioned in `networkx.addons`. 62 | 63 | #### Phase 2: Working with two graph libraries and hosting the add-ons on GitHub 64 | 65 | I will have to show that the implemented add-on system works. For this I'll have to implement two officially sanctioned add-ons and host them on GitHub. For this I choose the following two libraries: 66 | 67 | 1. METIS is a set of serial programs written in C, used for partitioning graphs, partitioning finite element meshes, and producing fill-reducing orderings for sparse matrices. It is truly amazing software. Some of its key features are high-quality partitions, extraordinary speed, and production of efficient low-fill orderings. 68 | 69 | 2. LEMON is a graph library written in C++ which provides implementations of common data structures and algorithms with a focus on combinatorial optimization tasks connected mainly with graphs and networks. LEMON is an abbreviation of Library for Efficient Modeling and Optimization in Networks. LEMON employs genericity in C++ by using [templates](http://en.wikipedia.org/wiki/Template_%28C%2B%2B%29). 70 | 71 | Both these add-ons will be hosted on GitHub under the NetworkX umbrella. I plan to work on the respective forks. By the end of every week or two, there will be something to merge. 72 | 73 | #### Phase 3: Working with the graph drawing package of NetworkX 74 | 75 | In this period, I will use the implemented add-on system to move the drawing package out of the NetworkX main repository and implement it as an add-on, hosted as a separate sub-repository under the NetworkX umbrella as `networkx-matplotlib`. This will not be similar to implementing a C/C++ add-on, as it will not require any wrapper or Cython. This will be a straightforward implementation of a Python module as an add-on. 76 | 77 | Additionally, I'll be writing tests side by side. Fixing all the bugs at the end is a much worse idea than not letting them appear in the first place. However, if any hiccups still persist, I'll try to solve them before merging the final work. 78 | 79 | ## Prototype 80 | 81 | One of the prime issues of this project is how the NetworkX API is going to be used while interacting with the add-ons, and how the implementation of add-ons is going to change the original NetworkX core package.
Such things got discussed in [#1429](https://github.com/networkx/networkx/issues/1429). 82 | 83 | There should be a clear distinction between the results obtained by using the NetworkX algorithms and their equivalents in the add-on. For instance, say I want to use `some_func()` on some graph `G`. The prototype looks like this: 84 | 85 | >>> import networkx as nx 86 | >>> from networkx.external.addons import lemon 87 | 88 | >>> G = nx.build_some_graph() 89 | >>> # Result using networkx 90 | >>> result_nx = nx.some_func(G, *args, **kwargs) 91 | >>> # Result using lemon 92 | >>> result_lemon = lemon.some_func(G, *args, **kwargs) 93 | 94 | In this way, no add-on algorithm needs to interfere with the original NetworkX algorithms. However, someone may be using NetworkX across a large code base of their own. For them, it would be handier to replace all the instances of a NetworkX algorithm with the add-on library's equivalent. To meet this need, there can be something like `use_addon(addon.some_func)` in the local namespace. 95 | 96 | >>> import networkx as nx 97 | >>> from networkx.external.addons import lemon 98 | 99 | >>> G = nx.build_some_graph() 100 | >>> # Result using networkx 101 | >>> result_nx = nx.some_func(G, *args, **kwargs) 102 | 103 | >>> # Result using lemon 104 | >>> result_lemon = lemon.some_func(G, *args, **kwargs) 105 | 106 | >>> nx.use_addon(lemon.some_func) 107 | >>> # Now nx.some_func will use lemon. 108 | >>> result = nx.some_func(G, *args, **kwargs) 109 | 110 | Using an add-on should leave the options open for the user; that's why both APIs are favorable. 111 | 112 | While doing 113 | 114 | >>> from networkx.external.addons import lemon 115 | this does not indicate that all the add-on functions will reside in the `networkx.external.addons` namespace. Those will be provided, along with the bindings, in the separately hosted add-on package, `networkx-lemon` in this case. So, to use its functions, we need to import them into the local namespace. The `lemon.py` in `networkx.external.addons` would then look like this: 116 | 117 | ``` 118 | try: 119 |     from networkx_lemon import *  # importable module of the networkx-lemon package 120 | except ImportError: 121 |     print("Please install networkx-lemon to use this add-on") # Or maybe raise an exception? 122 | ``` 123 | 124 | ## Timeline(Tentative) 125 | 126 | **Community Bonding Period (27 April - 25 May)** 127 | 128 | **Goal**: Community Bonding 129 | 130 | * The principal focus in this period would be studying, in detail, the functionalities of NetworkX and making notes, so as to compare them with those of the add-ons. 131 | 132 | * I'll ask my mentor for guidance on the functioning of the add-on system, because that is the most crucial part of my project. 133 | 134 | * If possible, I would also start coding in this phase itself, so that I get a head start. 135 | 136 | **Week 1 - Week 2 (25 May - 9 June)** 137 | 138 | **Goal**: Creating the add-on system 139 | 140 | * I'll create the general add-on system skeleton. I'll be finished when NetworkX is ready to have add-ons implemented for it. 141 | 142 | * I'll also draw up the skeleton of the sub-repositories. By this week, the first `networkx-` repository will be hosted on GitHub. 143 | 144 | **Week 3 - Week 4 (10 June - 26 June)** 145 | 146 | **Goal**: Implementing METIS 147 | 148 | * I'll start with implementing METIS. I'll take [ysitu's](https://www.github.com/ysitu) [work](https://github.com/ysitu/networkx/commit/bd0cf745e327e8410290f5b8f21f98e1d836b59a) as my starting point.
149 | 150 | * In this period, I'll be writing Python wrappers of METIS graph partitioning functions. 151 | 152 | * I'll also be writing tests along with them. 153 | 154 | **Mid term Evaluation** 155 | 156 | * Having a fully functioning add-on system with one nontrivial add-on working properly. 157 | 158 | * Fix bugs, if any. 159 | 160 | **Week 5 - Week 7 (27 June - 17 July)** 161 | 162 | **Goal**: Modifying the LEMON Graph Library for use 163 | 164 | * The LEMON Graph Library has more varied functionality. This period will be dedicated to studying and modifying the functions to be used. 165 | 166 | * As soon as I finish modifying the LEMON graph packages, I'll move on to writing code for NetworkX. 167 | 168 | **Week 8 - Week 10 (18 July - 7 August)** 169 | 170 | **Goal**: Writing code to NetworkX, writing tests 171 | 172 | * In this period I'll be writing Python wrappers of LEMON Graph Library functions. 173 | 174 | * Finishing the add-on after writing tests. 175 | 176 | **Week 11 - Week 12 (8 August - 21 August)** 177 | 178 | **Goal**: Working with NetworkX-Matplotlib 179 | 180 | * Remove the NetworkX drawing package from the core package and implement it as an add-on for NetworkX, hosted on GitHub under the NetworkX umbrella. 181 | 182 | * Finishing/fixing bugs for the existing add-ons so far. 183 | 184 | **Future Work -** Continue working on the add-on system going forward. I'd love to see future add-ons implemented on it. 185 | 186 | I'll be writing notes throughout the project. Later on I'll write IPython notebook tutorials for the work I'll have done. 187 | 188 | I would be able to devote 40 - 50 hours a week during the project, since I have no other big project planned for the summer. My summer vacation starts by 29 April and I will not be otherwise engaged during it. My academic year would begin by July 17, but for the first month I would still be able to devote around 40 hours a week, since there would be no tests/exams. 189 | 190 | --- 191 | 192 | 193 | ### Notes 194 | 195 | * I have no commitments during the summer, which means I'm free to work completely on my project. 196 | 197 | * I am very enthusiastic about my work being beneficial to other people in the open source community. I'll keep maintaining the work done by me and fixing issues if they emerge in the future.
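To make the Phase 2 interfacing work concrete, here is a minimal sketch of the "NetworkX objects to bare lists/arrays" conversion a METIS add-on would need: METIS consumes a graph as CSR-style adjacency arrays (`xadj`, `adjncy`). The actual call into METIS (e.g. `METIS_PartGraphKway` through a Cython binding) is omitted, and any wrapper behaviour implied here is hypothetical.

```python
import networkx as nx

def to_metis_arrays(G):
    """Convert an undirected NetworkX graph to METIS-style CSR arrays."""
    nodes = list(G.nodes())
    index = {n: i for i, n in enumerate(nodes)}
    xadj, adjncy = [0], []
    for n in nodes:
        adjncy.extend(index[m] for m in G.neighbors(n))  # neighbours as integer ids
        xadj.append(len(adjncy))                          # prefix offsets into adjncy
    return nodes, xadj, adjncy

G = nx.karate_club_graph()
nodes, xadj, adjncy = to_metis_arrays(G)
# A Cython-backed wrapper would hand xadj/adjncy to METIS_PartGraphKway
# and map the returned partition numbers back onto `nodes`.
```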
198 | 199 | --- 200 | 201 | ### References 202 | 203 | * Overview of graph libraries: 204 | http://lemon.cs.elte.hu/trac/lemon; 205 | http://glaros.dtc.umn.edu/gkhome/metis/metis/overview 206 | * Source codes: http://people.sc.fsu.edu/~jburkardt/c_src/metis/metis.html; 207 | http://lemon.cs.elte.hu/trac/lemon/wiki/Downloads 208 | * Related issues: https://github.com/networkx/networkx/issues/1167; 209 | https://github.com/networkx/networkx/issues/1325 210 | -------------------------------------------------------------------------------- /GSoC-2016/Accepted/PSF-MeetPragneshShah-RISCV-myHDL/PSF-MeetPragneshShah-RISCV-myhdl.md: -------------------------------------------------------------------------------- 1 | MyHDL: RISC-V Implementation 2 | ---------------------------- 3 | 4 | Student Info 5 | ------------ 6 | 7 | - Name : Meet Pragnesh Shah 8 | - City : Mumbai, India 9 | - Skype : meetshah1995 10 | - Email : meetshah1995@gmail.com 11 | - Alternate Email : meetshah1995@ee.iitb.ac.in 12 | - Github : [https://github.com/meetshah1995/](https://www.google.com/url?q=https://github.com/meetshah1995/&sa=D&ust=1461475401415000&usg=AFQjCNHdoqY63LlFnCvv3EO-JX6wv4V5pA) 13 | - GSoC Blog Feed URL : [http://myhdl-meetshah1995.blogspot.com/feeds/posts/default](https://www.google.com/url?q=http://myhdl-meetshah1995.blogspot.com/feeds/posts/default&sa=D&ust=1461475401416000&usg=AFQjCNHgtMl1WSROu7xHjZRlqel52_FKcQ) 14 | - Telephone : \ 15 | - IRC nick : meetshah1995 16 | - Primary Language : English 17 | - Time zone : GMT +5:30 18 | - Website : [http://meetshah1995.github.io](https://www.google.com/url?q=http://meetshah1995.github.io/&sa=D&ust=1461475401418000&usg=AFQjCNGaCSM2poUKLWMwq7XdO5QobftNyg) 19 | - Resume : \ 20 | 21 | ### Education 22 | 23 | - University: Indian Institute of Technology Bombay (IIT-Bombay) 24 | - Degree: B.Tech & M.Tech (Dual Degree) in Electrical Engineering 25 | - Current Year: 3rd (Junior) 26 | - Graduation Year: 2018 27 | - Minors: Computer Science and Operations Research 28 | - Master's Specialization: Communications and Signal Processing 29 | 30 | Patches Submitted 31 | ----------------- 32 | 33 | - Make the resync parameterizable: sync\_reset(clock, reset\_in, reset\_out, synclen=2), and fifo\_syncers cleanup. 34 | 35 | - [https://github.com/cfelton/rhea/pull/8](https://www.google.com/url?q=https://github.com/cfelton/rhea/pull/8&sa=D&ust=1461475401423000&usg=AFQjCNFAjYfBqaqjMOBL8Pricj2crZvegw) 36 | 37 | - Currently working on increasing the test coverage for enum and always\_seq blocks. 38 | 39 | - [https://github.com/meetshah1995/myhdl](https://www.google.com/url?q=https://github.com/meetshah1995/myhdl&sa=D&ust=1461475401424000&usg=AFQjCNF5WfBI0x8cerMDCMeX3bk-CASeaw) 40 | 41 | Project Info 42 | ------------ 43 | 44 | ### Proposal Title 45 | 46 | - Title : RISC-V CPU and CPU design tools implementation in myHDL 47 | 48 | ### Proposal Abstract 49 | 50 | RISC-V is an open ISA freely available for all types of use. The RISC-V ISA has been designed with small, fast, and low-power real-world implementations in mind, but without "over-architecting" for a particular microarchitecture style. 51 | 52 | RISC-V, being a base ISA, is carefully restricted to a minimal set of instructions sufficient to provide a reasonable target for compilers, assemblers, linkers, and operating systems (with additional supervisor-level operations), and so provides a convenient ISA and software toolchain skeleton around which more customized processor ISAs can be built.
53 | 54 | This project thus aims to leverage and demonstrate the advantages of myHDL, and Python in general, in the field of CPU design by implementing a RISC-V CPU (in myHDL) and other CPU design utilities. Since RISC-V is a base ISA, having a myHDL-based implementation becomes essential and would enable many computer architecture researchers to design and test RISC-V-based derivatives using myHDL and Python-based utilities. 55 | 56 | 57 | 58 | ### Features and Deliverable Specifications 59 | 60 | The module features would be as follows, at a very high level: 61 | 62 | - A robust, correct, rigorously tested implementation of a simple 32-bit CPU based on the RISC-V ISA in myHDL. 63 | - The modules (and corresponding tests) that shall be delivered by me would be: 64 | 65 | - RISC-V ISA decoder in pure Python and myHDL. 66 | - A simple 32-bit non-pipelined RISC-V based CPU. 67 | - A 3 stage pipelined design based on RISC-V. 68 | - Full 6 stage pipelined design based on RISC-V. 69 | - Utility tools for CPU design. 70 | 71 | - QA tests for the developed RISC-V modules to check correctness & performance. 72 | - Tutorials and documentation for the module and tools. 73 | 74 | ### Project Timeline 75 | 76 | This is merely a modest sketch. I have tried to be as lenient as possible in assigning the weekly tasks. The last 1-2 days of a week will usually be reserved for code review and documentation, or as buffers to complete work which has not been completed in the allotted time. Though I intend to keep in touch with the mentor throughout the week and ensure that I am going in the right direction, I have tried to maintain a feasible feedback mechanism in the schedule so as to ensure that my mentor gives feedback on a reviewable quantity of code at appropriate intervals. 77 | 78 | Documentation: The documentation for each week's work shall be done by the end of that week itself. In any case, no code shall stay undocumented for more than 1-2 weeks. 79 | 80 | Tests: The tests for individual modules shall be developed using a working RISC-V core [3], using intermediate outputs as stimulus to the respective blocks and comparing the output with ground truth. The testing will also be done over a chain of blocks, using the respective ground truths and stimulus, to test sequential correctness. The tests will also be carried out on FPGAs (Altera DE0) to test synthesizability. 81 | 82 | CPU Design Utilities: The present RISC-V repository provides a lot of CPU design utilities in shell [5]. As the project progresses, key utilities from the repository will be identified and implemented to support our implementation of the RISC-V CPU. A few other important utilities (like calculating CPI at variable workloads) will also be implemented. This is a non-exhaustive list and I will be implementing them as and when time permits, after GSoC as well. 83 | 84 | - Community Bonding period: 85 | 86 | - Read and brush up on the relevant theory (RISC-V Instruction Set Architecture) needed to implement the module. 87 | - Get working knowledge of the processor variants [3], modifications of which will be reused in implementing the RISC-V module. 88 | 89 | - Week 1 (6th May - 13th May): 90 | 91 | - Start with the decoder (both the pure Python and myHDL) implementation. 92 | - Define the interface needed for the ISA decoding and simulation. 93 | - Write the pseudocode for the decoder module, laying out the classes and methods to be used, which will help during implementation. 94 | - Write tests for the decoder module.
95 | - QA testing of the decoder module 96 | 97 | - Week 2 (13th May - 20th May): 98 | 99 | - Begin implementation of the non-pipelined CPU. 100 | - Use the existing datapath from a Verilog design and identify key modules. 101 | - Define the interface, classes and methods needed for the different modules [register file, ALU, IR, encoders etc.] of the design, taking help from existing designs. 102 | - Write the tests for individual processor modules. 103 | 104 | - Week 3 (20th May - 27th May): 105 | 106 | - Continue with the CPU implementation. 107 | - QA testing of the CPU. 108 | 109 | - Week 4 (27th May - 3rd June): 110 | 111 | - Begin implementation of the 3 stage pipelined CPU. 112 | - Use the existing pipelined datapath from a Verilog design and identify key modules. 113 | - Define the interface, pipeline stages, classes and methods needed for the different modules [fetch, decode, execute etc.] of the design, taking help from existing designs. 114 | - Identify the hazards and implement a Hazard Detection Unit [HDU]. 115 | - Write the tests for the individual processor modules. 116 | 117 | 118 | 119 | - Week 5 (3rd June - 10th June): 120 | 121 | - QA testing of the entire 3 stage processor. 122 | - Bug-squashing for errors in the processor implementation. 123 | - FPGA-based testing of the processor. 124 | 125 | - Week 6 (10th June - 17th June): 126 | 127 | - Begin implementation of the 6 stage in-order pipelined CPU, similar to the Rocket core [4]. 128 | - Use the existing pipelined datapath from a design and identify key modules. 129 | - Define the interface, pipeline stages, classes and methods needed for the different modules of the design, taking help from existing designs. 130 | - Write the tests for the individual processor modules. 131 | 132 | - Week 7 (17th June - 24th June): 133 | 134 | - Generate tests for individual processor modules. 135 | - Continue the implementation and side-by-side unit testing. 136 | 137 | - Week 8 (24th June - 1st July): 138 | 139 | - QA testing of the entire processor module. 140 | - Exhaustive chat with mentors regarding the mid-term evaluation and any parts of the code that I might need to improve upon. 141 | 142 | - Week 9 (1st July - 8th July): 143 | 144 | - Testing the entire processor synthesized on an FPGA. 145 | - Identification of utilities to be implemented and discussion with the mentor to finalize a list of them. 146 | 147 | - Week 10 (8th July - 15th July): 148 | 149 | - Implementing the CPU design utilities in Python and myHDL. 150 | - Identifying and running comparison tests to find the plus points of using myHDL and Python over conventional CPU design tools. 151 | - Writing the tests for the utilities. 152 | - Exhaustive documentation and tutorial writing begins. 153 | 154 | - Week 11 (15th July - 22nd July): 155 | 156 | - Documentation, and a buffer to complete any remaining parts. 157 | - Try to port the entire RISC-V utils repo and experiment with out-of-order and other architectural intricacies if things finish early. 158 | 159 | - Week 12 (22nd July - 29th July): 160 | 161 | - Write tutorials explaining how to use the module and other utilities. 162 | - Ideally a week to complete any remaining documentation of the code. 163 | - Buffer week for analyzing the suggestions from the myHDL community and talking to the mentor about implementing them. 164 | - Implementing the suggestions. 165 | - Final-evaluation chat with mentors and feedback on any changes needed in the code.
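As a small illustration of the Week 1 decoder deliverable, the sketch below slices a 32-bit instruction word into the RV32I base-format fields in myHDL. Immediate decoding, control-signal generation and conversion wrappers are left out, and the signal declarations shown in the trailing comments are assumptions of this sketch rather than the project's actual interface.

```python
from myhdl import Signal, intbv, always_comb

def rv32i_fields(instr, opcode, rd, funct3, rs1, rs2, funct7):
    """Combinational slicing of a 32-bit RISC-V instruction into its fields."""
    @always_comb
    def decode():
        opcode.next = instr[7:0]     # bits 6..0
        rd.next     = instr[12:7]    # bits 11..7
        funct3.next = instr[15:12]   # bits 14..12
        rs1.next    = instr[20:15]   # bits 19..15
        rs2.next    = instr[25:20]   # bits 24..20
        funct7.next = instr[32:25]   # bits 31..25
    return decode

# Signals would be declared roughly as (illustrative widths):
#   instr  = Signal(intbv(0)[32:])
#   opcode = Signal(intbv(0)[7:]);  funct3 = Signal(intbv(0)[3:])
#   rd = Signal(intbv(0)[5:]);      rs1, rs2 likewise; funct7 = Signal(intbv(0)[7:])
```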
166 | 167 | Summer Availability 168 | ------------------- 169 | 170 | - Classes End Date: 15/04/2016 (dd/mm/yyyy) 171 | - Exams End Date: 05/05/2016 172 | - Work Hours per week: Approximately 42 hours per week, or the amount of time required to finish the tasks allocated for the week (whichever is higher). 173 | 174 | Extra Information 175 | ----------------- 176 | 177 | ### Background 178 | 179 | Being an electrical engineering student interested in High Level Synthesis and VLSI, I think this is a very good opportunity for me to get hands-on development experience in this field. I am definitely prepared to learn more about it over the summer and explore this interesting field. Given my interests, I would like to keep adding functionality to myHDL post-GSoC as and when time permits. 180 | 181 | I designed two RISC processors, a multi-cycle one ([code](https://www.google.com/url?q=https://github.com/yashbhalgat/Multicycle-RISC-Processor&sa=D&ust=1461475401450000&usg=AFQjCNEpEyWvhnwTnHm4UXF8F2X-zLb_jg)) [[report](https://www.google.com/url?q=https://github.com/yashbhalgat/Multicycle-RISC-Processor/blob/master/report/Final_%2520Report_Meet_Navjot_Yash.pdf&sa=D&ust=1461475401451000&usg=AFQjCNG2EVfDVQtu_R7HzVKt0he5422kNQ)] and a pipelined one ([code](https://www.google.com/url?q=https://github.com/navisngh11/Pipelined-RISC15-Processor&sa=D&ust=1461475401452000&usg=AFQjCNEKzav6ArrxwtX9IJmwRpRyIYGwBA)) [[report](https://www.google.com/url?q=https://github.com/navisngh11/Pipelined-RISC15-Processor/blob/master/Report/Pipelined_RISC_Report.pdf&sa=D&ust=1461475401453000&usg=AFQjCNHqjmSsTl0aLU4ASehca2HvegzRIw)], as a part of my EE309 Microprocessors course, and have been through rigorous Verilog exercises as a part of my EE224 Digital Systems Lab. I also interned with the Computation Acceleration team at the University of Illinois at Urbana-Champaign's Advanced Digital Sciences Center, where I developed a synthesizable arbitrary-precision fixed- and floating-point library for their High Level Synthesis tool, which also works with Vivado and Calypto. I thus feel that I have experience with the concepts and skills needed to complete this project. 182 | 183 | ### Coding Preferences 184 | 185 | I have been interested in open source programming since my freshman year at college. I was a part of the Autonomous Underwater Vehicle ([AUV-IITB](https://www.google.com/url?q=http://www.auv-iitb.org/Web/&sa=D&ust=1461475401455000&usg=AFQjCNE7R-aflCowrJ5pAi9srl4jIGMlIA)) team as a software engineer, where we developed the entire stack for our AUV Matsya (which reached the semi-finals at RoboSub in 2013, 2014 and 2015). I have also done a bunch of other parallel computing, computer vision and image processing projects, like a chess-playing robot, and written a lot of utility software including Django servers and Android apps. 186 | 187 | I code on my Ubuntu desktop using vim, as I think the availability of a large number of plugins for vim makes it customizable to suit my needs. I am comfortable with a lot of programming languages, but I specifically like Python because of its intuitive interface and very broad spectrum of usage, from simple scripting to web applications and scientific uses. One specific feature I like about Python is decorators :). 188 | 189 | ### MyHDL Example Code 190 | 191 | I have ported a few modules to myHDL from my earlier processor design in Verilog and will be porting other modules soon. If time permits I will implement the entire RISC processor in myHDL.
192 | 193 | [https://github.com/meetshah1995/myhdl-examples](https://www.google.com/url?q=https://github.com/meetshah1995/myhdl-examples&sa=D&ust=1461475401458000&usg=AFQjCNEbX1xKSp2uEPwzTx7r6YNYX28Yqw) 194 | 195 | References 196 | ---------- 197 | 198 | [1](https://www.google.com/url?q=http://riscv.org/specifications/&sa=D&ust=1461475401458000&usg=AFQjCNGMApxZ5ri0-H8rthF8D-ErlJgNFQ) : RISC-V ISA, http://riscv.org/specifications/ 199 | 200 | [2](https://www.google.com/url?q=https://github.com/riscv&sa=D&ust=1461475401459000&usg=AFQjCNEBo1T8dD4ZIYm32HWw7g0RfX2C1g) : RISC-V Github organization and codebase,  https://github.com/riscv 201 | 202 | [3](https://www.google.com/url?q=https://bitbucket.org/casl/shakti_public&sa=D&ust=1461475401460000&usg=AFQjCNHbhwyxru6Mhp7T6d7Cs4vihU69YQ) : RISC-V based processors variants, https://bitbucket.org/casl/shakti\_public    203 | 204 | [4](https://www.google.com/url?q=https://github.com/ucb-bar/rocket&sa=D&ust=1461475401461000&usg=AFQjCNHRcQBPxnDWYL5Ys94O3pGb8kTAmw) : Rocket core, https://github.com/ucb-bar/rocket 205 | 206 | [5](https://www.google.com/url?q=https://github.com/riscv/riscv-tools&sa=D&ust=1461475401462000&usg=AFQjCNFDJVWge-zUbH7PqzgZkFk3uNUE_w) : RISC-V tools and utils, https://github.com/riscv/riscv-tools 207 | 208 | 209 | -------------------------------------------------------------------------------- /GSoC-2015/Accepted/Pratyaksh-PSF-Sampling-Algorithms-in-pgmpy/Pratyaksh-PSF-Sampling-Algorithms-in-pgmpy.md: -------------------------------------------------------------------------------- 1 | Sub-organization information 2 | ============================ 3 | 4 | - Sub-organization with whom you hope to work(\*): pgmpy 5 | 6 | Student Information 7 | =================== 8 | 9 | - Name(\*): Pratyaksh Sharma 10 | - Email(\*): psharma1707@gmail.com 11 | - Telephone(\*): +91 720 804 9809 12 | - Time zone(\*): GMT +5:30 13 | - IRC nick on freenode.net: pratyash 14 | - Source control username(s)(\*): Github: 15 | [pratyakshs](https://www.google.com/url?q=http://www.github.com/pratyakshs&sa=D&usg=AFQjCNHD_xB9RDGl5X8_w5p4UI7I0nXN7g) 16 | - Instant Messaging information: Google Hangouts: 17 | [psharma1707@gmail.com](mailto:psharma1707@gmail.com) 18 | - Any other ways we can reach you: Skype: psharma1707 19 | - Blog(s)(\*): 20 | [Blogger](https://www.google.com/url?q=http://pratyaksh-gsoc2015.blogspot.com&sa=D&usg=AFQjCNEukKJ24Sm8_0mIw8E_HZKVD1x6Gg) 21 | - GSoC Blog RSS feed(\*): 22 | [Link](https://www.google.com/url?q=http://pratyaksh-gsoc2015.blogspot.in/feeds/posts/default?alt%3Drss&sa=D&usg=AFQjCNEuVq2jvTdw5QrCqMndR1rFdpmK6Q) 23 | - Resume: 24 | [Link](https://www.google.com/url?q=https://www.dropbox.com/s/zxd4fib36pgy4f0/resume.pdf?dl%3D0&sa=D&usg=AFQjCNHKDrrw9ra7k84KI7ruGSyK3PEbuw) 25 | - Primary Language: English 26 | 27 | University Information 28 | ====================== 29 | 30 | - University(\*): Indian Institute of Technology, Bombay 31 | - Major(\*): Computer Science and Engineering 32 | - Current Year and Expected Graduation date(\*): Currently in 3rd 33 | year, expected to graduate in Aug 2016. 34 | - Degree(\*) (e.g. BSc, PhD): B.Tech. 35 | 36 | Project Proposal Information 37 | ============================ 38 | 39 | - *Proposal Title*(\*): pgmpy: Implementation of sampling algorithms for 40 | approximate inference in probabilistic graphical models 41 | - *Proposal Abstract* (\*): 42 | - Currently pgmpy supports algorithms like Variable Elimination and Belief Propagation for exact inference. 
In large graphs (particularly those having large treewidth), exact inference becomes computationally intractable. 43 | - Thus, there's a need for approximate algorithms which answer the inference query with a lower time complexity. 44 | - In this project I will implement the two most popular algorithms which fall under the more general class of Markov Chain Monte Carlo methods: 45 | 46 | - Gibbs sampling. My implementation will closely follow the discussion in sections 12.3.1-12.3.3 of [Koller]^[[1]](#ftnt1)^ 47 | - Metropolis–Hastings algorithm. Refer to section 12.3.4 of [Koller]^[[1]](#ftnt1)^ 48 | - I will also implement a general framework which will allow more MCMC algorithms to be added easily. 49 | - Basically, there is a variety of MCMC sampling methods. If time permits, I shall implement: 50 | - Blocking Gibbs sampler 51 | - Collapsed Gibbs sampler 52 | - Rao-Blackwellisation 53 | - AND/OR importance sampling 54 | - Hamiltonian MCMC (becoming increasingly popular in recent times) 55 | 56 | References: 57 | 58 | 1. Chapter 12. Koller, Daphne, and Nir Friedman. Probabilistic 59 | graphical models: principles and techniques. MIT Press, 2009. 60 | 2. B. Bidyuk, V. Gogate, R. Dechter: Tutorial on Sampling Techniques 61 | for Probabilistic and Deterministic Graphical Models (AAAI-2010). 62 | [Link](https://www.google.com/url?q=https://www.ics.uci.edu/~dechter/talks/tutorial-aaai-2010.pdf&sa=D&usg=AFQjCNHLTeGOgjbfC2hG5RZpUEJ4T0GPMQ) 63 | 64 | 65 | 66 | Detailed description of algorithms 67 | ================================== 68 | 69 | - Sampling-based estimation: 70 | 71 | - Given a process to sample from a probability distribution 72 | ![](images/image00.png), we sample ![](images/image01.png). The 73 | ![](images/image02.png)’s are sampled independently and identically 74 | from ![](images/image00.png). 75 | - To estimate any function ![](images/image03.png) of 76 | ![](images/image00.png), we could use the result: 77 | 78 | ![](images/image04.png) 79 | 80 | - Monte Carlo algorithms are based on the fact that while it may not be feasible to compute expectations under ![](images/image05.png), it may be possible to obtain samples from ![](images/image05.png), 81 | or from a closely related distribution, such that marginals and other expectations can be approximated using sample-based averages. 82 | 83 | - Gibbs sampling is an example of a Markov chain Monte Carlo (MCMC) 84 | algorithm. In an MCMC algorithm, samples are obtained via a Markov 85 | chain whose stationary distribution is the desired 86 | ![](images/image05.png). The state of the Markov chain is a set of 87 | assignments of values to each of the variables, and, after a 88 | suitable “burn-in” period so that the chain approaches its 89 | stationary distribution, these states are used as samples. 90 | - The Markov chain for the Gibbs sampler is constructed in a 91 | straightforward way: 92 | 93 | - at each step one of the variables ![](images/image06.png) is 94 | selected (at random or according to some fixed sequence), 95 | - the conditional distribution ![](images/image07.png) is computed, 96 | - a value ![](images/image08.png) is chosen from this distribution, and 97 | - the sample ![](images/image08.png) replaces the previous value of the 98 | ith variable. 99 | 100 | - The implementation of Gibbs sampling thus reduces to the computation 101 | of the conditional distributions of individual variables given all 102 | of the other variables.
For graphical models, these conditionals 103 | take the following form: 104 | 105 |         $p(x_i \mid x_{-i}) \propto \prod_{C \in \mathcal{C}_i} \psi_C(x_C)$ 106 | 107 |         where ![](images/image09.png) denotes the set of cliques that 108 | contain index ![](images/image10.png). This set is often much smaller 109 | than the set ![](images/image11.png) of all cliques, and in such cases 110 | each step of the Gibbs sampler can be implemented efficiently. 111 | 112 | - When the computation of the above equation is overly complex, the Metropolis-Hastings algorithm can provide an effective alternative. Metropolis-Hastings is an MCMC algorithm that is not based on 113 | conditional probabilities, and thus does not require normalization. 114 | 115 | - Given the current state ![](images/image12.png) of the algorithm, Metropolis-Hastings chooses a new state ![](images/image13.png) from a “proposal distribution,” which often simply involves picking a 116 | variable ![](images/image14.png) at random and choosing a new value for that variable, again at random. The algorithm then computes the “acceptance probability”: 117 | 118 |         ![](images/image15.png)         119 | 120 | - With probability ![](images/image16.png) the algorithm accepts the proposal and moves to ![](images/image13.png), and with probability ![](images/image17.png) the algorithm remains in state ![](images/image12.png). For graphical models, this computation also turns out to often take the form of a simple message-passing algorithm. 121 | 122 | 123 | Timeline 124 | ======== 125 | 126 |         The following is merely a modest sketch. It is possible that if 127 | the allocated work for a week finishes ahead of time, I shall start 128 | the next week’s tasks. On the contrary, a week’s tasks may spill over to 129 | the next. This timeline is quite a lenient one for me. I believe I will 130 | end up exceeding my promises. 131 | 132 | - Week 1: 133 | 134 | - Implement Forward Sampling (FS), Rejection Sampling (RS) 135 | - Tune parameters such as: 136 | - Number of samples required to estimate a distribution. 137 | - Deliverables: A method which takes as arguments a Bayesian Network 138 | and an evidence set and returns the distribution P(Y | E=e), where Y 139 | is the set of unobserved variables and E is the evidence set. 140 | 141 | - Week 2: 142 | 143 | - Implement Importance Sampling (IS) 144 | - Write tests+documentation for FS, RS and IS 145 | 146 | - Week 3: 147 | 148 | - Implement a general-purpose Markov Chain Monte Carlo class which has 149 | methods that take as input a Markov Chain (with corresponding 150 | transition probability matrix) and simulate a run of the same. 151 | - Write tests+documentation 152 | 153 | - Week 4: 154 | 155 | - Write functions to compute various statistics which compare 156 | probabilities across different initializations and across different 157 | windows in the same run. 158 | - Tune parameters (window size, various thresholds) for deciding if 159 | the distribution has converged to a stationary one. 160 | - Deliverables: An MCMC class with methods to sample from it and 161 | report various statistics to decide mixing. 162 | - Write tests+documentation 163 | 164 | - Week 5: 165 | 166 | - Implement plots/charts etc. for visualizing the statistics computed 167 | in week 4. 168 | 169 | - This will help the user decide if the distribution has converged. 170 | 171 | - Write tests+documentation 172 | 173 | - Week 6: 174 | 175 | - Within the MCMC class, write functions to generate the Gibbs chain, 176 | given a Bayesian Network or Markov Network.
177 | - Implementing the transition model. 178 | - Write tests+documentation 179 | 180 | - Week 7: 181 | 182 | - With the Gibbs chain, write a function to sample a target Gibbs 183 | distribution 184 | - Deliverables: A method to estimate the Gibbs distribution of a 185 | bayesian network or a markov network. 186 | 187 | - Week 8: 188 | 189 | - Write tests for inference using Gibbs sampling. 190 | - Analyse performance 191 | 192 | - Compare (time/space efficiency+accuracy) with exact algorithms 193 | - Find performance bottlenecks (a compact representation of states is 194 | absolutely necessary). 195 | 196 | - Week 9: 197 | 198 | - Implement the framework for Metropolis-Hastings algorithm 199 | - Write methods to initialize the proposal distribution and acceptance 200 | distribution 201 | - Write an interface for inference which supports the various 202 | algorithms in pgmpy for both undirected and directed models. 203 | - Deliverables: Methods supporting various inference (approximate) 204 | queries on graphical models. 205 | 206 | - Week 10: 207 | 208 | - Tests+debugging+documentation for Metropolis-Hastings 209 | - Do a performance analysis + look for improvements 210 | - Buffer time 211 | 212 | - Week 11: 213 | 214 | - Write an interface for inference which supports the various 215 | algorithms in pgmpy for both undirected and directed models. 216 | - Deliverables: Methods supporting various inference (approximate) 217 | queries on graphical models. 218 | - Write documentation+tests 219 | 220 | - Week 12: 221 | 222 | - Complete the integration 223 | - Complete documentation 224 | - Test edge cases 225 | - Buffer time 226 | 227 | The documentation for each week’s work shall be done by the end of that 228 | week only. In any case, any code shall not stay undocumented for more 229 | than 1-2 weeks. 230 | 231 | The documentation shall also contain ipython notebook(s) which will form 232 | part of the tutorial. 233 | 234 | Link to a patch/code sample 235 | =========================== 236 | 237 | -  I submitted two pull requests, both of which have been merged. 238 | 239 | - Fixed a bug, which was causing issues with the latest version of 240 | networkx (1.9.1) 241 | [https://github.com/pgmpy/pgmpy/pull/346](https://www.google.com/url?q=https://github.com/pgmpy/pgmpy/pull/346&sa=D&usg=AFQjCNHcanfLKaasd2rD4FLwUawZLK37cw) 242 | - Added functionality to compute induced graph 243 | [https://github.com/pgmpy/pgmpy/pull/349](https://www.google.com/url?q=https://github.com/pgmpy/pgmpy/pull/349&sa=D&usg=AFQjCNGYNrWdn5SeUs3lJHKmemQzMJA0vA) 244 | 245 | Time Commitment 246 | =============== 247 | 248 |         Approximately 40 hours per week or the amount of time required 249 | to finished the tasks allocated for the week (whichever higher). 250 | 251 | Background 252 | ========== 253 | 254 | I was introduced to graphical models during my second year, from the 255 | coursera course on the same. During May-July 2014, I worked at the 256 | [Graphical Model 257 | Algorithms](https://www.google.com/url?q=http://graphmod.ics.uci.edu/group&sa=D&usg=AFQjCNEhDa78y8P1qyApkIfVamVK6-hRHw) group 258 | at UC Irvine, under Prof. Rina Dechter. Rina (and her group) has been at 259 | the forefront of research in PGMs. The winning solver at [Pascal 260 | approximate inference competition (at UAI 261 | 2012)](https://www.google.com/url?q=http://www.auai.org/uai2012/pascal.shtml&sa=D&usg=AFQjCNFEwz3HwZQ11egcZDZU_JIGKqpTNQ) was 262 | developed at Rina’s group. 
I worked on weighted best-first search 263 | algorithms for anytime approximate inference in PGMs. My contribution 264 | was to implement the algorithm and compare its performance with the 265 | branch and bound methods (which is the state of the art). The [paper we 266 | wrote](https://www.google.com/url?q=http://www.ics.uci.edu/~dechter/publications/r216.pdf&sa=D&usg=AFQjCNF_R3H5S6GfmIX15a1Ok9r6Rw3Enw) was 267 | accepted at [PlanSOpt workshop at 268 | AAAI’15](https://www.google.com/url?q=http://org.nicta.com.au/plansopt-15-aaai-workshop-planning-search-optimization/&sa=D&usg=AFQjCNGB4GBRUWXEwOfTTr0wJjwOMN8ZBQ). 269 | I also participated in the [PASCAL approximate inference challenge at 270 | UAI 271 | 2014](https://www.google.com/url?q=http://www.hlt.utdallas.edu/~vgogate/uai14-competition/leaders.html&sa=D&usg=AFQjCNE1zWiGhxK7VPmqESeIT0WsFfABMg), 272 | submitting the solver that we developed. 273 | 274 | *Why pgmpy?* 275 | 276 | During my time at UC Irvine, I strongly felt the need of a robust 277 | library for PGMs. The 278 | 279 | existing codebase that they used was poorly documented and it took me a 280 | lot of time just to get a hang of it. Around December 2014, I stumbled 281 | upon pgmpy. I found that the organisation participates in GSoC. The 282 | codebase, though quite limited, seemed decently documented. The fact 283 | that the whole project is written in python made it easier for me to 284 | understand (the code at UC Irvine was written in C++). Basically, I feel 285 | a PGM library was desperately needed and I think I’ll have fun expanding 286 | it. 287 | 288 | * * * * * 289 | 290 | [[1]](#ftnt_ref1) Koller, Daphne, and Nir Friedman. Probabilistic 291 | graphical models: principles and techniques. MIT press, 2009. 292 | -------------------------------------------------------------------------------- /GSoC-2024/accepted/ToL-RohanBarsagade-Kinfin_Integration/ToL-RohanBarsagade-Kinfin_Integration.md: -------------------------------------------------------------------------------- 1 | # Project Proposal 2 | 3 | ## Integrating KinFin Proteome Cluster analyses into Genome Browsing environments 4 | 5 | - **Organisation** : Wellcome Sanger Tree of Life 6 | - **Mentor** : Dr. Richard Challis 7 | - **Applicant** : Rohan Rajendra Barsagade 8 | 9 | # PERSONAL INFORMATION 10 | 11 | ## CONTACT INFORMATION 12 | 13 | - **NAME** : Rohan Rajendra Barsagade 14 | - **EMAIL** : rbarsagade101@gmail.com 15 | - **GITHUB** : [rohan-b-84](https://www.github.com/rohan-b-84) 16 | - **LINKEDIN** : [r-barsagade](https://www.linkedin.com/in/r-barsagade) 17 | - **WEBSITE** : [https://rbarsagade.tech](https://rbarsagade.tech) 18 | - **LOCATION** : Kharagpur, West Bengal, India 19 | - **TIMEZONE** : IST (UTC +5:30) 20 | 21 | ## STUDENT AFFILIATION 22 | 23 | - **UNIVERSITY** : Indian Institute of Technology Kharagpur 24 | - **DEGREE** : Bachelors of Technology (B.Tech.) 25 | - **MAJOR** : Biotechnology & Biochemical Engineering (with micro-specialization in Artificial Intelligence and its Applications) 26 | - **EXPECTED GRADUATION** : 2025 27 | 28 | ## BRIEF BIO 29 | 30 | I am Rohan R. Barsagade, a third-year undergraduate student at the Indian Institute of Technology Kharagpur (India). I am interested in exploring innovative ways to analyse and interpret biological data, particularly in the field of systems biology and bioinformatics, and leverage the potential of computational tools to speed up biological research. 31 | 32 | My programming language skill set spans Python, C, C++, Golang, and Javascript. 
33 | 34 | Through my involvement in open-source projects, I have come to appreciate the importance of well-documented, readable, and maintainable code, especially at the intersection of biotechnology and open-source software development, where freely available code empowers researchers around the world. It is this focus on clean, clear, and reusable code that has drawn me to the KinFin project. 35 | 36 | # PROJECT DESCRIPTION 37 | 38 | ## ABSTRACT 39 | 40 | KinFin is a computational tool designed for the analysis and visualisation of protein family clusters. However, KinFin currently lacks integration with web-based genome browsing systems such as Ensembl and GenomeHubs. This project proposes to address this gap with a two-phased approach that makes KinFin's functionalities accessible from within these genome browsing platforms. 41 | 42 | ## BACKGROUND 43 | 44 | Understanding protein families, groups of proteins that share ancestral lineage and functional characteristics, is vital for biology. Bioinformatics tools like OrthoFinder automate the identification of these groups, but visualising the massive data they generate remains a challenge. KinFin is a tool designed to analyse protein families, offering valuable insights, but integrating it with genome browsers is challenging, primarily due to the complexity of KinFin's data files and their incompatibility with genome browser formats. 45 | 46 | KinFin's current workflow is fragmented. Researchers need to use several separate scripts for tasks like preprocessing, leading to a slow and disjointed experience. 47 | 48 | Additionally, KinFin results are presented in different file formats, making it difficult for researchers to explore their data effectively due to incompatibility with genome browsing environments. This incompatibility with genome browsers restricts researchers from utilising all the information at their disposal. 49 | 50 | Establishing a direct pipeline and API integration between KinFin and genome browsers could resolve the existing challenges. This integration would improve the workflow, enhance data accessibility, and enable intuitive visualisation of protein families within their genomic context. By combining the capabilities of KinFin and genome browsers, researchers would greatly benefit from leveraging the strengths of both platforms, facilitating their research work. 51 | 52 | ## PROJECT PROPOSAL 53 | 54 | ### OVERVIEW 55 | 56 | This proposal outlines a **two-phased approach** to integrate KinFin Proteome Cluster analyses into Genome Browsing environments: 57 | 58 | - **Phase 1**: Phase 1 will focus on refactoring the KinFin code-base to improve its maintainability, readability, and testability, and to pave the way for a web-based integration. 59 | 60 | - **Phase 2**: Phase 2 will establish the KinFin as a Service (KaaS) API, which can integrate with genome browsing platforms like GenomeHubs. 61 | 62 | #### Technologies 63 | 64 | The primary technologies used will be **Python** and **Git**.  65 | Within Python, I am thinking of using FastAPI as the specific framework for the API implementation (Phase 2). This is because FastAPI provides high-performance request handling and also handles automatic API documentation generation. 66 | 67 | ### PLAN OF ACTION 68 | 69 | #### Phase 1: Refactoring the KinFin code-base 70 | 71 | Currently, KinFin's code-base is primarily in a single large Python file `kinfin.py` within the `/src` directory.
This file contains **various classes, data manipulation methods, and utility functions exceeding 2000 lines**. Additionally, the project structure **includes build and distribution files** alongside example input files, scripts, and dependency management files. The code-base also has **some Python 2 syntax**, which is no longer recommended. 72 | 73 | Phase 1 will focus on refactoring the code for maintainability, readability, and testability. 74 | 75 | ![_**Image**: Screenshot from `kinfin.py` file showing multiple utility functions and classes defined_](media/util_classes_functions.png) 76 | 77 | _**Image**: Screenshot from `kinfin.py` file showing multiple utility functions and classes defined_ 78 | 79 |
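To make the Python 2 issue concrete, the toy before/after below shows the kind of change the migration involves; the function and data are invented for illustration and are not taken from KinFin's actual code (the screenshot that follows shows the outdated syntax in the real code-base).

```python
# Python 2 constructs of the kind flagged above (shown here as comments):
#     print "selected %d genes" % len(result)
#     for gene, isoforms in gene_dict.iteritems(): ...
# Python 3 equivalent after the migration:
def longest_isoform(gene_dict):
    """Keep only the longest isoform per gene (toy data, not KinFin's logic)."""
    result = {}
    for gene, isoforms in gene_dict.items():           # .iteritems() -> .items()
        result[gene] = max(isoforms, key=len)
    print("selected {} genes".format(len(result)))     # print statement -> print() call
    return result

longest_isoform({"geneA": ["MKV", "MKVLLQ"], "geneB": ["MST"]})
```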
80 | 81 | ![_**Image**: Screenshot `from filter_isoforms_based_on_gff3.py` file showing outdated Python2 syntax_](media/outdated_py2_syntax.png) 82 | 83 | _**Image**: Screenshot `from filter_isoforms_based_on_gff3.py` file showing outdated Python2 syntax_ 84 | 85 | ##### Phase 1 Goals: 86 | 87 | - Refactor KinFin into smaller, well-defined modules for improved code maintainability and comprehension 88 | - Enable compatibility with external applications like genome browsing environments by making KinFin's output consumable by them (e.g., JSON, CSV formats) 89 | - Ensure core KinFin functionality remains intact throughout the refactoring process 90 | 91 | ##### Phase 1 Tasks: 92 | 93 | 1. Update the code-base and migrate to Python 3 syntax for wider compatibility. 94 | 2. Breakdown monolithic `kinfin.py` file into smaller, independent, and reusable modules which modules can then be imported and utilised as needed. 95 | 3. Decouple application logic from data manipulation routines to enable integration within both a Command-Line Interface (CLI) and API environments. 96 | 4. Refactor KinFin's output generation to support both CLI and API needs. 97 | 5. Remove unnecessary build and distribution files from the version control system and introduce a `.gitignore` file to manage the exclusion of specific files from version control. 98 | 99 | ##### Phase 1 Deliverables: 100 | 101 | - A well-structured KinFin code-base with modularized components 102 | - Code-base migrated to Python 3 for improved compatibility and adherence to modern practices 103 | - Refactored logic for stateless operation, enabling API integration 104 | - Flexible output generation capabilities, supporting both CLI application and API (JSON) formats 105 | - Improved project management with the implementation of a `.gitignore` file 106 | - Documentation on the revised project structure 107 | 108 | ##### Phase 1 Benefits 109 | 110 | - Enhanced code maintainability, readability, and testability due to modularization. 111 | - Broader compatibility with external applications due to Python 3 migration 112 | - Stateless architecture and flexible output format for easy scalability and flexibility 113 | - Improved project organisation and management through version control best practices 114 | 115 | #### Phase 2: Implementing KaaS API Architecture 116 | 117 | The second phase of this project focuses on establishing a KinFin-as-a-Service (KaaS) API, paving the way for integration with Genome Browsing environments like GenomeHubs. 118 | 119 | Currently, KinFin analysis relies on a series of independent scripts for data preprocessing, configuration, and core analysis functionalities. Users navigate through a sequence of scripts, making decisions about data preprocessing (filtering, isoform removal, etc.) and functional annotation retrieval (InterProScan) at each step based on their requirements. This approach necessitates managing output and config files at the end of each step and navigating complex user flows. 120 | 121 | Designing the KaaS API will involve analysing the user flow and identifying critical decision points within the user flow (as shown in the flowchart). These points would represent functionalities currently controlled by the requirements of the user and the configuration file and would be the key to determine the script execution sequences. 122 | 123 | The KaaS API will be designed to capture these decision points as API parameters. 
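As an illustration of this design, a minimal FastAPI sketch of one such endpoint is shown below; the endpoint path, field names, and behaviour are assumptions made for illustration only and are not KinFin's actual interface.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="KinFin-as-a-Service (sketch)")

class AnalysisOptions(BaseModel):
    filter_isoforms: bool = False      # decision point: filter isoforms before analysis?
    fetch_annotations: bool = False    # decision point: retrieve functional annotations?

@app.post("/analyse")
async def analyse(options: AnalysisOptions):
    # A full implementation would call the refactored KinFin modules here and
    # return JSON that a genome browser can consume directly.
    return {"status": "accepted", "options": options.dict()}
```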
This will allow users and applications to specify desired functionalities (e.g., data preprocessing, annotation retrieval) without relying on complex configuration files. Users would submit data and configuration parameters directly through the API, receiving analysis results in a standardised format like JSON. This would simplify the workflow and facilitate integration with external systems. 124 | 125 | To ensure a robust and maintainable API, we will initially prioritise essential features like data submission, core KinFin analysis, and result delivery in JSON format. Subsequently, it can then be upgraded to systematically integrate additional user configuration options based on mentor’s feedback and project priorities. 126 | 127 | While the API is the primary focus of Phase 2, the KinFin script itself will not be abandoned. We will introduce a `--serve` flag to the script. This flag, when used, will deploy the KinFin service, making it accessible through API calls. However, the script's existing functionality with its traditional arguments will be preserved. This would ensure continuity for existing users who prefer a command-line workflow and provides a familiar interface alongside the new API access. 128 | 129 | ![_**Figure**: User Flow devised based on KinFin documentation_](media/flowchart.png) 130 | 131 | _**Figure**: User Flow devised based on KinFin documentation_ 132 | 133 | ##### Phase 2 Goals: 134 | 135 | - Develop a well-defined API architecture to expose KinFin's functionalities as a web service 136 | - Implement a Minimum Viable Product (MVP) of the KaaS API with essential functionalities for basic KinFin analysis 137 | - Progressively enhance the KaaS API to encompass the full range of KinFin's configuration options 138 | - Maintain the existing CLI functionality for users who might prefer a command-line workflow. 139 | 140 | ##### Phase 2 Tasks: 141 | 142 | 1. Define the API structure, including endpoints, data models, request parameters, and response formats 143 | 2. Develop the initial API endpoint to handle basic cluster data analysis. Users can submit essential data like clustering data and sequence IDs through the API and receive KinFin's core analysis results in JSON format. 144 | 3. Progressively expand the API functionalities by creating additional endpoints that mirror the complete KinFin user flow (represented by the flowchart). Each endpoint will address specific user decisions and data manipulations, like data pre-processing, isoform removal, functional annotations, and configuration options. 145 | 4. Integrate the API functionalities within the existing KinFin script. A `--serve` flag will allow users to deploy the KinFin service, making it accessible through API calls. Traditional CLI functionality using existing arguments will still be available. 146 | 5. Develop comprehensive API documentation to guide users on endpoint usage, data formats, and expected responses. Implement robust error handling mechanisms to provide informative messages in case of issues. 147 | 148 | ##### Phase 2 Deliverables: 149 | 150 | - A well-documented API specification outlining API endpoints, request/response structures, and data formats. 151 | - Functional API endpoints that provide KinFin’s protein cluster data analysis 152 | - Progressively implemented API endpoints mirroring the complete KinFin user flow. 
153 | - A KinFin script with integrated API capabilities, allowing users to choose between CLI and API access using the `--serve` flag 154 | - Comprehensive API documentation and error handling mechanisms. 155 | 156 | ##### Phase 2 Benefits: 157 | 158 | - Increased accessibility to KinFin functionalities through a user-friendly API. 159 | - Preserved CLI functionality to ensure continuity for existing users. 160 | - Integration with Genome Browsing environments like GenomeHubs. 161 | - Improved user experience with clear documentation and informative error handling. 162 | 163 | ### TIMELINE 164 | 165 | #### Timeline / Milestones 166 | 167 | - **Week 1**: In the first week, I will focus on migrating from Python 2 to Python 3 and updating the repository to include build and dist files in gitignore. Additionally, I will update the readme to include instructions on how to build the service. 168 | - **Week 2 and 3**: After migrating to Python 3, I will work on modularizing the KinFin code-base. This involves restructuring the repository to organise utility functions and class definitions into separate files, making them easier to manage and use. 169 | - **Week 4 and 5**: Once the initial organisation is complete, I will begin establishing the architecture for the KaaS service. This will involve transitioning from a stateful to a stateless implementation wherever possible. 170 | - **Week 6**: I will focus on documenting all the changes made during the modularization and architecture setup. I will also update the plans for the different KaaS endpoints if necessary. 171 | - **Week 7 and 8**: In Week 7, I will begin implementing the KaaS service by installing necessary dependencies for a basic FastAPI architecture and planning the API design. By Week 8, I aim to have a basic API endpoint implemented, returning analysis data in JSON format that can be utilised by genome browsers for visualisation. 172 | - **Week 9 to 11**: During the Weeks 9, 10, and 11, I will focus on expanding the basic endpoints to include various customizable analysis options provided by KinFin. This will be done progressively, starting with basic endpoints and gradually incorporating more advanced configuration options. 173 | - **Week 12**: Week 12 will involve project wrap-up, including testing, error handling, and completion of any remaining tasks. 174 | 175 | #### Commitments 176 | 177 | During weekdays (Monday through Friday), I will be able to dedicate **3-5 hours** to project work, ensuring alignment with UK working hours. This allows for efficient communication and collaboration. To compensate for the reduced weekday schedule, I'm happy to invest additional hours on weekends (Saturday and Sunday) – typically **7-8 hours** per day. This extended weekend commitment ensures I meet the proposed timeline while respecting the time zone difference. 178 | 179 | #### Additional Information about the Timeline 180 | 181 | - The timeline mentioned above is subject to change and is only an approximate outline of my project work. I will stick to or exceed this schedule and create a more detailed schedule during the pre-GSoC and community bonding phase. 182 | - I've no other commitments during the summer and can dedicate 30 to 35 hours a week. During the last month of the project, my college will begin, and I'll be able to commit a max of 20 a week. Due to the same, I will do a significant portion of the work before this period. 
183 | - Time will be divided (according to workload) each week amongst planning, learning, coding, documenting and testing features. All documentation will go hand in hand with the development. 184 | 185 | ### POST GSOC PLANS 186 | 187 | I am committed to the long-term sustainability and success of the KinFin project. While the initial scope of this proposal focuses on users providing the clustering data from OrthoFinder and uploading necessary files, I recognize the potential for further enhancing user experience. Time permitting, I would like to explore the possibility of allowing users to submit only accession numbers of the protein sequences they wish to analyse. This would streamline the user workflow by eliminating the need to manage and upload individual files. Additionally, the possibility of developing a dedicated user interface for the KinFin service can be explored to further enhance user experience. 188 | -------------------------------------------------------------------------------- /GSoC-2014/Accepted/Mozilla-ManishG-Servo/Mozilla-ManishG-Servo.md: -------------------------------------------------------------------------------- 1 | Abstract: Currently, Servo has no AJAX support exposed to JavaScript. AJAX is an integral part of the web these days, so the lack of this feature makes Servo much less useful as a browser. The goal of this project is to have a partial implementation of the XMLHttpRequestobject -- one which is self-consistent and covers most use cases that are found today. 2 | 3 | ### Personal Details 4 | 5 | - Name: Manish Goregaokar 6 | 7 | - Email: something 8 | 9 | - IRC nick on irc.mozilla.org: Manishearth 10 | 11 | - Telephone: something 12 | 13 | - Other contact methods: Google Talk (IRC preferred for short communications) 14 | 15 | - Country of residence: India 16 | 17 | - Timezone: IST (GMT + 0530) 18 | 19 | - Primary language: English 20 | 21 | ### Project Proposal 22 | 23 | The final goal is to partially implement the XMLHttpRequest and associated helper objects. The following portions of the protocol will be implemented (full spec [*here*](http://xhr.spec.whatwg.org/)): 24 | 25 | - open() method: Both sync and async arguments, GET/POST requests. No support for HTTP auth. 26 | 27 | - send() method: Initially focus on simple GET sending, then move on to application/x-www-form-urlencoded and text/plain (if possible, multipart/form-data). If there is time (not necessarily during the summer), expand to Blobs and other types. 28 | 29 | - Some work for this could be done by expanding rust-http (specifically RequestWriter or a wrapper of the same), which currently has no distinction between head[er and postdata. ](http://www.google.com/url?q=http%3A%2F%2Fmxr.mozilla.org%2Fservo%2Fsource%2Fsrc%2Fcomponents%2Fnet%2Fresource_task.rs&sa=D&sntz=1&usg=AFQjCNEEQB1YVmFd8kNPrfSRO1Ot7LErQg)There’s [*a pending issue*](https://www.google.com/url?q=https%3A%2F%2Fgithub.com%2Fchris-morgan%2Frust-http%2Fpull%2F23&sa=D&sntz=1&usg=AFQjCNEN1xNF042tWbPj4ZR02jwl6BK0Ig) on this.[](http://mxr.mozilla.org/servo/source/src/components/net/resource_task.rs) 30 | 31 | - [*resource\_task.rs*](http://www.google.com/url?q=http%3A%2F%2Fmxr.mozilla.org%2Fservo%2Fsource%2Fsrc%2Fcomponents%2Fnet%2Fresource_task.rs&sa=D&sntz=1&usg=AFQjCNEEQB1YVmFd8kNPrfSRO1Ot7LErQg) contains code that handles forming tasks for HTTP requests. It seems to be easy to make async (in a sense it already is). Header support will need to be added, as well as postdata support after the rust-http changes are made. 
32 | 33 | - For suporting application/x-www-form-urlencoded , we can use [*form\_urlencoded.rs*](https://github.com/SimonSapin/rust-url/blob/master/form_urlencoded.rs) from rust-url. 34 | 35 | - For multipart/form-data, [*formdata.rs*](https://github.com/mozilla/servo/blob/master/src/components/script/dom/formdata.rs) can be extended so that a FormData object can be loaded and converted to text. The goal is to have something like [*this*](https://developer.mozilla.org/en-US/docs/Web/Guide/Using_FormData_Objects) work (for strings), since some AJAX on the web is done this way. 36 | 37 | - I will try to implement as much as possible of this separately from the XHR code, this will be useful for forms later. I plan to look into making text based forms (sans UI) work before the summer starts. 38 | 39 | - Essentials like readyState, statusText, responseText must be implemented. 40 | 41 | - Allow for setting timeouts, custom headers, and fetching of headers from a request. 42 | 43 | - Events: onreadystatechange, onload, onerror, ontimeout, onabort. Investigate necessity of onprogress and implement if necessary. 44 | 45 | Additional components of the proposal are: 46 | 47 | - A testsuite that covers most of the implemented work, which is easy to extend. Use [*wptserve*](http://wptserve.readthedocs.org/en/latest/) or similar for generating responses. This will basically spawn instances of wptserve from the usual content test engine, run tests, and collect results. 48 | 49 | - Try to make jQuery's $.get,$.post,$.ajax work with the implemented XHR object. Nowadays most of the AJAX on the Web is done by proxy of jQuery. In parallel with the XHR work (or before the summer), I can attempt to create a dummy XHR object in JS, identify, and try to fix any issues with jQuery on Servo. 50 | 51 | ### Schedule of Deliverables 52 | 53 | My summer vacation is from May 1 to (approximately) July 21 (as opposed to GSoC’s May 19-Aug 18) . This period is slightly shorter (though it starts early), however I can also work during the autumn semester until the official end of GSoC. I will be unable to devote a full 5-7 hours per day for this last part, it should realistically be around 1-2 hours on weekdays and 5-9 hours on weekends. 54 | 55 | I also plan to get used to the codebase during this semester, so I doubt that the shift in timings should be an issue. 56 | 57 | I’ve tried to structure the timeline so that there are fewer weeks which are purely for coding or purely for learning. The last 1-2 days of a week will usually be reserved for code review and pull requesting (if necessary), though I intend to keep in touch with the mentor throughout the week and ensure that I’m going in the right direction. I’ll try to pull request consistent portions of the project bit by bit, instead of submitting it all at once -- given the frequency of rust upgrades, and the instability of other modules, it’s best to keep up with the rest of the code. Since the testing framework won’t be in a commitable state until later, I intend to keep a makeshift testcase in the pull requests themselves. 58 | 59 | Timeline: 60 | 61 | - (Before May 1): 62 | 63 | - Write some fake XHR objects that return dummy data. These will contain only the features that the proposal plans to implement. I can use this with jQuery on Chrome and see if it’s enough to make jQuery AJAX work; the results of these tests will let me know if there are additional features that my proposal will need. 
Then, I can test it with Servo and note down what JS features need to be implemented, and file issues for the same. jQuery is a rather complex library -- the goal here is not to get the entire library to work, but to fix enough of the library so that the basic AJAX methods work. 64 | 65 | - Try to get some experience in the relevant portions of the codebase by partially implementing forms. This is not part of the proposal, but if implemented it would be significantly easier to implement POST requests. 66 | 67 | - May 1 - May 4: Discuss project implementation in detail with mentor and knowledgeable community members. Focussed studying of spec (as well as [*Firefox*](http://mxr.mozilla.org/mozilla-central/source/dom/workers/XMLHttpRequest.cpp)[*’s implementation*](http://mxr.mozilla.org/mozilla-central/source/dom/workers/XMLHttpRequest.cpp)), and comparison with implemented code for other objects. Ideally this will be done beforehand, but I’m leaving this time period in here just in case. Additionally, I can learn more about how tasks from script\_task and resource\_task work. 68 | 69 | - May 4 - May 11: Start coding. Work on prerequisite framework: 70 | 71 | - Tweak resource\_task by adding an async method. 72 | 73 | - If the rust-http issue has not yet pushed through, work on a simpler fix that adds a simple send() to RequestWriter. This might take longer than a week, since tests will need to be written and code will need to be reviewed. Since this is not entirely necessary for the first part, it can be done in parallel. 74 | 75 | - Pull request for the above two points (individually) 76 | 77 | - Read [*wptserve*](http://wptserve.readthedocs.org/en/latest/) docs and formulate a plan for testing. We’ll need to figure out what kinds of tests we want, and how we will structure the tests. (I have already thought up a bit about the structure of the framework, more details below) 78 | 79 | - May 11 - May 18: XHR basics 80 | 81 | - Add the necessary webidls and codegen changes for the XMLHTTPRequest-related objects. Write an impl for the object that contains dummy methods (similar to most of the unimplemented html\*element.rs files). 82 | 83 | - Implement a basic open() that just preps the object without doing anything. 84 | 85 | - Pull request for the above two points 86 | 87 | - Start work on the testing framework. This involves writing modifying the content test runner so that it spawns a dummy wptserve instance which is able to return various responses from the test directory. 88 | 89 | - May 18 - May 25: Implement send() for GET. Unless resource\_task makes this problematic, try to get both sync and async working. Test against localhost (SimpleHTTPServer or Flask as makeshift testers), the testing framework probably won’t be ready till now. Support redirects and other similar cases, as well as some basic HTTP errors. responseText should work as well. Pull request for the above, with details on the makeshift testing method 90 | 91 | - May 25 - June 8 (2 weeks): 92 | 93 | - Handle onreadystatechange. This might have to be tested via addEventListener() depending on the implementation status of EventHandler (I plan to look into EventHandler during the current semester). 94 | 95 | - Add header / content support to [*resource\_task.rs*](http://mxr.mozilla.org/servo/source/src/components/net/resource_task.rs). Hook up the header support to the XHR code. 
This might change depending on the status of the pending RequestWriter changes (though a makeshift wrapper for RequestWriter can be built for the scope of this project, and swapped out later on) 96 | 97 | - Pull request the above two points (separately) 98 | 99 | - Investigate the full range of possible errors and bad data input, this will be necessary later for testing. 100 | 101 | - Continue work on the testing framework. At this stage, one should be able to write simple tests where the details of server response is stored in something akin to Mochitest’s [*^headers^*](https://developer.mozilla.org/en/docs/Mochitest#How_do_I_change_the_HTTP_headers_or_status_sent_with_a_file_used_in_a_Mochitest.3F) file. While not all types of response details might be implemented in this portion of the summer, the basic framework should be up, and it should be easy to add support for more response details later. There will be certain tests that will require server-side checking of sent data. For these we can keep the test result in the response itself, and check it in the client. 102 | 103 | - June 8 - June 15: Add support for POST (application/x-www-form-urlencoded and text/plain) to send(). This will require using the changes to [*resource\_task.rs*](http://mxr.mozilla.org/servo/source/src/components/net/resource_task.rs) implemented in the previous week. A new struct, [*URLSearchParams*](http://url.spec.whatwg.org/#urlsearchparams) (similar to [*FormData*](http://mxr.mozilla.org/servo/source/src/components/script/dom/formdata.rs)), will have to be implemented that uses [*form\_urlencoded.rs*](https://github.com/SimonSapin/rust-url/blob/master/form_urlencoded.rs) for encoding. 104 | 105 | - June 15 - June 22: 106 | 107 | - Implement the other event listeners for the XHR object, with error handling. Also implement timeouts. 108 | 109 | - Start writing tests. Add options to the framework when needed. 110 | 111 | - Pull request the above two separately. I’m uncertain if bors will be able to handle the new framework without changes, so for now the tests may be kept independent of make check and will have to be run locally by others. 112 | 113 | - Prepare for midterm evaluation. (June 22) 114 | 115 | - June 22 - June 29: 116 | 117 | - Continue writing tests. Which tests are necessary will need to be discussed with the community (and gleaned from [*existing Firefox code*](http://mxr.mozilla.org/mozilla-central/source/dom/workers/test/)), so this could take a while. 118 | 119 | - Make bors work with the framework, if necessary. Hook the tests up with make check if we are reasonably sure of their reliability. It’s best if we avoid breaking tests for everyone. 120 | 121 | - Implement any remaining necessary attributes from the spec (eg responseType if necessary, etc) 122 | 123 | - Pull request above three points separately. 124 | 125 | - June 29 - July 6: By now, a usable XHR object should be implemented. In this week, I intend to test it out on existing non-jQuery websites. I will already have done some of the work for this with the dummy XHR object. Fix any issues that occur. 126 | 127 | - July 6 - July 13: Work on getting it to work with jQuery. Write some basic jQuery testcases and see what errors are thrown. Hopefully the issues that were filed during the bonding period will have been fixed (by me or someone else), otherwise work on them now. One specific thing that will need to be implemented is the ability to spoof the content-type header. 
While jQeury appears to send application/x-www-form-urlencoded data, internally it uses a urlencoded string passed to the object. 128 | 129 | - July 13 - July 20: Wrap up the bulk of the project, pull request any remaining code. Document it. (I intend to try to keep the wiki updated as the project goes on, but it might be better to document it at the end). 130 | 131 | - July 20 - Aug 18: Semester will start. This time is intended to be a buffer, in case some of the above spills over. If there is extra time, I can work on the following (if not, I can always look into these after GSoC gets over): 132 | 133 | - ProgressEvent support 134 | 135 | - EventHandler support if it doesn’t already exist. This includes hooking it up to the on\* attributes of XHR. 136 | 137 | - Other postdata types like text/html, multipart/form-data (this one might take time since it depends on [*formdata.rs*](https://github.com/mozilla/servo/blob/master/src/components/script/dom/formdata.rs)), and Blob (ditto for [*blob.rs*](https://github.com/mozilla/servo/blob/master/src/components/script/dom/blob.rs)) 138 | 139 | ### Open Source Development Experience 140 | 141 | - I’ve been contributing to firefox for a couple of months ([*bugzilla profile*](https://bugzilla.mozilla.org/user_profile?login=manishearth%40gmail.com)), mainly in Javascript. I also have recently started mentoring bugs, and in general enjoy helping new users get started. 142 | 143 | - I have a [*couple of commits*](https://github.com/mozilla/servo/commits?author=Manishearth) to servo itself 144 | 145 | - I am one of the developers for the Wikipedia Account Request system ([*repo*](https://github.com/enwikipedia-acc/waca), [*commits*](https://github.com/enwikipedia-acc/waca/commits?author=Manishearth)), though recently I haven’t been contributing there much due to time constraints. 146 | 147 | - I’ve helped plan and develop a couple of Stack Exchange-related apps in [*this organization*](https://github.com/Charcoal-SE), in collaboration with two others. 148 | 149 | - Quite a bit of my own projects are open source ([*on github*](https://github.com/Manishearth)). Some recent ones: 150 | 151 | - [*AnnoTabe*](http://softwarerecs.stackexchange.com/a/1913/19), a tab annotator for Chrome 152 | 153 | - [*Kapi*](https://github.com/Manishearth/Kapi), a Windows 8 app for fluid mathematical note-taking. This is in collaboration with two other students. 154 | 155 | - A [*noticeboard*](https://github.com/Manishearth/HostelNoticeboard) for Raspberry Pi with a centralized upload system. This is in collaboration with one other student. 156 | 157 | - Some [*StackExchange userscripts*](https://github.com/Manishearth/Manish-Codes) and a [*python wrapper for Stack Exchange chat*](https://github.com/Manishearth/ChatExchange) 158 | 159 | ### Work/Internship Experience 160 | 161 | As mentioned above, I have some experience with Firefox and servo. I’m been an active Javascript coder for many years now, and while I avoid the “design” portion of “web design”, I understand most of how javascript works. 162 | 163 | I’ve not done any programming/CS interns (mostly physics), but I have been contracted for a couple of scripts or website work in the past (eg, recently I wrote one for Mathoverflow). Most of my programming has been done as a volunteer in open source projects. 164 | 165 | ### Academic Experience 166 | 167 | I’m currently studying physics. I’ve always had a passion for both programming and physics. 
However, while it is possible for me to improve my programming on the side while studying physics (most of my programming was self-taught anyway), I don’t think it’s possible for me to keep in touch with physics when my courses will be programming courses. Physics is something that is significantly harder to teach oneself in my opinion. Of course, I’ll never reach the programming/CS level of a student who has that as their primary topic, but I guess some compromise has to be made if I want to enjoy both worlds. 168 | 169 | ### Why Mozilla 170 | 171 | I’ve already been participating in the Mozilla community for a while, mainly with small Firefox features. Compared to the other online communities I’ve participated in, Mozilla is the most welcoming, and I enjoy being a part of it. Till now I haven’t had a chance to do anything major here, and I feel that GSoC is a great way for me to do a major project. Additionally, Servo in particular is very much a work in progress, and I feel that I can have a better impact by implementing core features in a budding browser over enhancing an existing application. 172 | -------------------------------------------------------------------------------- /GSoC-2023/Accepted/PostgreSQL-IshaanAdarsh-postgres-extension-tutorial/PostgreSQL-IshaanAdarsh-postgres-extension-tutorial.md: -------------------------------------------------------------------------------- 1 | # GSoC 2023 PostgreSQL Project Proposal 2 | 3 | ## Postgres Extension Tutorial 4 | 5 | ### Basic Information: 6 | 7 | - **Name:** Ishaan Adarsh 8 | - **GitHub:** [https://github.com/IshaanAdarsh](https://github.com/IshaanAdarsh) 9 | - **Email:** ishaanad9@gamil.com 10 | - **LinkedIn:** [https://www.linkedin.com/in/ishaan-adarsh-161a56222/](https://www.linkedin.com/in/ishaan-adarsh-161a56222/) 11 | - **Location:** Bengaluru, India (GMT+5:30) 12 | 13 | ### About me: 14 | 15 | My name is Ishaan Adarsh, and I am a 2nd-year Engineering Student at the National Institute of Technology, Raipur. I am writing to express my interest in the Postgres extension tutorial/quick start (2023). I have experience in several programming languages, including C++, Solidity, HTML, CSS, ReactJS, and Svelte Kit. In addition, I have experience with SQL and database design, which I believe will be valuable in developing an effective tutorial for the Postgres extension system. 16 | 17 | ### Why I am interested in PostgreSQL: 18 | 19 | I have always been passionate about open-source. However, I didn’t have a chance to contribute to an actual open-source project. I hope to take the opportunity of GSoC to become a Postgres contributor, and I think it’s a good chance for me to get involved in the open-source community. 20 | 21 | I believe that open-source software is an essential tool for promoting collaboration and innovation, and I am excited to have the opportunity to contribute to a project that will make it easier for developers to contribute to Postgres. 22 | 23 | In addition to my technical skills, I am a strong communicator and collaborator. I believe that effective communication and collaboration are essential for success in any open-source project, and I am committed to working closely with the Postgres community to develop a tutorial that meets the needs of developers. Overall, I believe that my technical skills, passion for open-source software, and strong communication and collaboration skills make me an ideal candidate for the Postgres Extension Tutorial and Quick Start Guide project. 
24 | 25 | ## Deliverables: 26 | 27 | 1. A comprehensive tutorial/quick start guide for writing Postgres extensions, covering the following topics: 28 | - Prerequisites for writing an extension 29 | - Starting an extension 30 | - Writing a Makefile 31 | - Using PGXS and PGXN 32 | - Using Procedural Languages and External Languages 33 | - Writing regression tests 34 | - Release management and upgradability 35 | - The tutorial should be easy to follow, with clear examples and step-by-step instructions. 36 | 37 | 2. A sample Postgres extension: 38 | - To illustrate the concepts covered in the tutorial, a sample Postgres extension should be developed using Python and JavaScript. 39 | - The extension should be well-documented and include regression tests to ensure it is working correctly. 40 | 41 | 3. Documentation for the sample extension: 42 | - Documentation should be provided for the sample extension, including a README file and any necessary installation instructions. 43 | 44 | 4. Integration with the Postgres documentation system: 45 | - The tutorial and sample extension should be integrated into the official Postgres documentation system to ensure they are easily accessible to users. 46 | 47 | 5. A set of best practices and guidelines for extension development, covering topics such as security, performance, and maintainability. 48 | 49 | 6. Testing and validation: 50 | - The tutorial, sample extension, and documentation should be tested and validated to ensure they are working correctly and meet the requirements outlined in the project description. 51 | - Collaboration with other Postgres developers to incorporate feedback and suggestions for improving the extension development process. 52 | 53 | 7. Ongoing maintenance and support (Post Work Program): 54 | - The project should include ongoing maintenance and support for the tutorial, sample extension, and documentation to ensure they stay up-to-date with changes to Postgres and continue to meet the needs of the community. 55 | 56 | ## Detailed Description: 57 | 58 | The Postgres database management system is a powerful and flexible platform that allows developers to build complex applications. One of the key features of Postgres is its extension system, which allows developers to extend the functionality of the database by writing their own custom extensions. However, the process of writing a Postgres extension can be daunting for new developers, as the documentation is often difficult to navigate, and the learning curve is steep. 59 | 60 | To address this issue, we propose to create a comprehensive tutorial and quick start guide for writing Postgres extensions, with the aim of lowering the barrier to entry for new contributors and making it easier for experienced developers to create high-quality extensions. 61 | 62 | ### It will cover the following topics: 63 | 64 | 1. **Prerequisites for writing an extension:** 65 | - This section will cover the knowledge needed to write an extension: 66 | - Postgres 67 | - Target languages (C, Python, JavaScript) 68 | - Tools and Software’s needed: 69 | - PostgreSQL Server 70 | - PGXS 71 | - PGXN 72 | - Git: Used in Postgres Extension Development to manage code changes, collaborate with other developers, and maintain a history of the codebase. 73 | - To make this part of the tutorial more accessible to novice developers, we can provide: 74 | - Step-by-step instructions to install and configure each of these tools and software. 
We can also provide code examples and snippets to help developers get started quickly. 75 | - Provide links to additional resources such as documentation and tutorials to help developers deepen their understanding of these tools and software. 76 | 77 | 2. **Starting an extension:** 78 | - This section will explain how to create a new extension, discuss the structure of an extension, and explain how to write extension code. 79 | - **Problem:** Novice developers may not be familiar with the process of creating a new extension in Postgres or understanding the structure of an extension. 80 | - **Solution:** 81 | - This can include the use of tools such as the pgxn command-line tool to automate the creation of extension scaffolding. 82 | - We can provide explanations of the various components of an extension, such as the control file and SQL script files, to help developers understand the structure of an extension. 83 | - To guide developers through writing extension code, we can provide examples of basic extension functionality, such as adding a new SQL function, with explanations of the code and how it works. 84 | - We can provide information on debugging and troubleshooting extension code, such as using pg_config to check for missing dependencies. 85 | 86 | 3. **Writing a Makefile:** 87 | - This section will discuss the role of a Makefile in extension development and provide an example Makefile. 88 | - **Problem:** Novice developers may not be familiar with Makefiles and their role in extension development, which can make it difficult for them to build and manage their extensions. 89 | - **Solution:** 90 | - Explain the purpose of Makefiles in extension development. 91 | - Create a sample Makefile that developers can use as a template for their own extensions. It should also include variables for defining the extension name, version number, and other metadata. The Makefile should include commands for: 92 | - Building the extension 93 | - Running regression tests 94 | - Installing the extension. 95 | - Offer troubleshooting advice for common issues that may arise when building and installing extensions using Makefiles. This can include tips for debugging Makefile syntax errors or resolving dependency issues. 96 | 97 | 4. **Using PGXS and PGXN:** 98 | - These sections will explain what PGXS and PGXN are, their role in extension development, and how to use them. 99 | - **Problem:** Novice developers might not be familiar with PGXS and PGXN, which can make it difficult for them to build, install, and distribute their extensions. 100 | - **Solution:** 101 | - The tutorial can start by defining what PGXS and PGXN are and why they are important in extension development. 102 | - We can then provide a step-by-step guide on how to use PGXS to build and install an extension, along with code snippets and examples. We can also explain how to use PGXN to distribute the extension and make it available to other users. 103 | - We can provide information on how to test the extension using PGXS and PGXN, and how to troubleshoot common issues that might arise during the build and installation process. By the end of this section, novice developers should be able to understand and use PGXS and PGXN to build, install, and distribute their extensions. 104 | 105 | 5. **Procedural Languages and External Languages:** 106 | - These sections will discuss the role of procedural and external languages in extension development and explain how to use them. 
107 | - **Problem:** Novice developers may not be familiar with procedural and external languages or their role in Postgres extension development. 108 | - **Solution:** 109 | - Explain what procedural languages and external languages are, and their role in Postgres extension development. 110 | - Explain how to use procedural languages in extension development, including how to write functions and triggers using: 111 | - PL/pgSQL 112 | - PL/Python. 113 | - Provide examples of popular external languages used in Postgres extension development, such as: 114 | - JavaScript 115 | - Python. 116 | - Explain how to use external languages in extension development, including how to write functions and triggers using external languages, and how to compile and link external code to Postgres. 117 | - Provide code snippets and examples for each language, showing how to write simple functions and triggers using each language. 118 | - Explain how to test and debug functions and triggers written in the target language. 119 | 120 | 6. **Regression Tests:** 121 | - This section will discuss the importance of regression tests, explain how to write regression tests for an extension, and provide examples of regression tests. 122 | - **Problem:** Novice developers may not understand the importance of regression testing and may not know how to write regression tests for their extensions. 123 | - **Solution:** 124 | - Explain the importance of regression testing: Emphasize the importance of regression testing to catch bugs before they make it into production and cause problems for end-users. 125 | - Provide a step-by-step guide for writing regression tests for extensions. This guide should include the following steps: 126 | - Setup a test database: Create a new database specifically for testing the extension. This database should be separate from the production database. 127 | - Write test cases: Write test cases that cover all the features of the extension. Each test case should be designed to test a specific feature or scenario. 128 | - Write test scripts: Write test scripts in the target language (Python or JavaScript) to automate the testing process. 129 | - Run the tests: Run the test scripts against the test database to ensure that the extension is working as expected. 130 | - Provide examples of regression tests: To help developers understand how to write regression tests, provide code examples of regression tests in the target language. 131 | 132 | 7. **Release Management and Upgradability:** 133 | - This section will discuss best practices for managing releases and upgrades of extensions, as well as how to make an extension upgradable. 134 | - To ensure that the tutorial and sample code are accurate and up-to-date, we will collaborate with other Postgres developers to incorporate feedback and suggestions for improving the extension development process. We will also provide ongoing support and maintenance for the tutorial and sample code, including updating it to reflect changes in Postgres and addressing feedback from users. 135 | 136 | ### Approach for completion of the Postgres Extension Tutorial: 137 | 138 | As a beginner in Postgres, I bring a fresh perspective to the Postgres Extension Tutorial and Quick Start Guide project. My position affords me the opportunity to experience first-hand the challenges that novice programmers encounter when developing or utilizing extensions. 
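Tying back to the regression-testing guide above (section 6), here is a minimal sketch of the kind of Python test script the tutorial could include, using psycopg2 as an assumed client library; the extension name, function, and database settings are placeholders rather than parts of a real extension.

```python
import psycopg2

def test_extension_function():
    # Connect to a dedicated test database, never the production one.
    conn = psycopg2.connect(dbname="extension_test")
    try:
        with conn.cursor() as cur:
            cur.execute("CREATE EXTENSION IF NOT EXISTS my_extension;")
            cur.execute("SELECT my_add(2, 3);")  # my_add() assumed to be provided by the extension
            assert cur.fetchone()[0] == 5
        conn.commit()
    finally:
        conn.close()

if __name__ == "__main__":
    test_extension_function()
    print("regression test passed")
```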
139 | 140 | My status as a relative newcomer to the topic of Postgres Extensions will allow me to approach the problem with a beginner's mindset, unencumbered by preconceived notions or assumptions about the process. This unique position enables me to provide valuable insight into the difficulties that others may face when attempting to navigate the Postgres extension system. 141 | 142 | **1. Planning Phase:** 143 | - The first phase of the project will involve planning and scoping the work required. 144 | - This will involve a detailed review of the existing documentation for the Postgres extension system and identifying areas that need improvement. 145 | - We will also identify the programming languages and tools required for the project, as well as any dependencies or third-party libraries needed. 146 | 147 | **2. Development Phase:** 148 | - The development phase will involve creating the tutorial and sample code, as well as the testing framework for Postgres extensions. 149 | - The tutorial will be written in a clear and concise language and structured in a way that is easy for developers to follow. 150 | - Sample code will be provided for each topic covered in the tutorial, demonstrating best practices for extension development. 151 | - The testing framework will be designed to allow developers to easily create and run regression tests for their extensions. 152 | 153 | **3. Collaboration and Feedback:** 154 | - Throughout the development phase, we will collaborate with other Postgres developers to incorporate feedback and suggestions for improving the extension development process. 155 | - This will involve sharing the tutorial and sample code with the Postgres community and actively seeking feedback through online forums and mailing lists. 156 | 157 | **4. Documentation Improvements:** 158 | - As part of the project, we will also improve the existing documentation for the Postgres extension system. 159 | - This will involve updating the official Postgres documentation with new content and clarifying existing content to make it more accessible to developers. Following the Postgres docguide which specifies how the documentation is formatted ([https://www.postgresql.org/docs/current/docguide.html](https://www.postgresql.org/docs/current/docguide.html)). 160 | 161 | **5. Best Practices and Guidelines:** 162 | - We will create a set of best practices and guidelines for extension development, based on our experience and feedback from the Postgres community. 163 | - These best practices will cover areas such as: 164 | - Code structure 165 | - Naming conventions 166 | - Release management. 167 | 168 | **6. Support and Maintenance:** 169 | - Once the tutorial and supporting resources are complete, we will provide ongoing support and maintenance for the project. 170 | - This will involve updating the tutorial and sample code to reflect changes in Postgres and addressing feedback from users. 171 | 172 | ### Approximate schedule: 173 | 174 | | **Week** | **Date Range** | **Tasks** | 175 | |----------|----------------|-----------| 176 | | Week 1 | 5/29-6/4 | - Planning and scoping: Identify areas for improvement and develop a plan for the tutorial. Define project milestones and deliverables. | 177 | | Week 2 | 6/5-6/11 | - Planning and scoping: Review existing Postgres extension system documentation. | 178 | | Week 3 | 6/12-6/18 | - Development phase – 1: Develop the tutorial framework and structure. 
| 179 | | Week 4 | 6/19-6/25 | - Development phase – 1: Write and test sample code for each topic covered in the tutorial. | 180 | | Week 5 | 6/26-7/2 | - Collaboration and Feedback: Share the tutorial and sample code with the Postgres community for feedback. | 181 | | Week 6 | 7/3-7/9 | - Collaboration and Feedback: Incorporate feedback and suggestions for improvement. Update the tutorial and sample code accordingly. | 182 | | Week 7 | 7/10-7/16 | - Development phase – 2: Develop the testing framework for Postgres extensions. | 183 | | Week 8 | 7/17-7/23 | - Development phase – 2: Write and test regression tests for sample code. | 184 | | Week 9 | 7/31-8/6 | - Documentation Improvements: Clarify existing content to make it more accessible to developers. | 185 | | Week 10 | 8/7-8/13 | - Documentation Improvements: Improve the official Postgres extension system documentation with new content. | 186 | | Week 11 | 8/14-8/20 | - Best Practices and Guidelines: Develop a set of best practices and guidelines for extension development. | 187 | | Week 12 | 8/21-8/26 | - Support and Maintenance: Provide ongoing support and maintenance for the tutorial, sample extension, and documentation to ensure they stay up-to-date with changes in Postgres and continue to meet the needs of the community. | 188 | 189 | --------------------------------------------------------------------------------