├── .github
│   └── ISSUE_TEMPLATE
│       ├── pacback-bug-report-template.md
│       └── pacback-feature-request.md
├── .gitignore
├── .gitmodules
├── CODE_OF_CONDUCT.md
├── DEVELOPER_GUIDE.md
├── LICENSE
├── README.md
├── USER_GUIDE.md
├── build
│   ├── BUILDPKG_FULL_TEMPLATE
│   ├── BUILDPKG_GIT_TEMPLATE
│   ├── INSTALL_TEMPLATE
│   ├── config
│   ├── make-full-release.sh
│   └── make-git-release.sh
└── core
    ├── create.py
    ├── custom_dirs.py
    ├── error.py
    ├── meta.py
    ├── pacback.py
    ├── restore.py
    ├── session.py
    ├── user.py
    ├── utils.py
    └── version.py

/.github/ISSUE_TEMPLATE/pacback-bug-report-template.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Pacback Bug Report Template 3 | about: Basic Bug Template 4 | title: 'Bug Title' 5 | labels: '' 6 | assignees: JustinTimperio 7 | 8 | --- 9 | 10 | **Describe the Bug:**\ 11 | A clear and concise description of what the bug is. 12 | 13 | **Output of Command** 14 | - Output sent to the user in terminal 15 | - Log output in /var/log/pacback.log 16 | - Config file in /etc/pacback/config 17 | 18 | **Machine (please fill in the following information):** 19 | - Kernel Version: (`uname -r`) 20 | - Python Version: (`python3 --version`) 21 | - Python Rich Version (`pacman -Q | grep python-rich`) 22 | - Python Requests Version (`pacman -Q | grep python-requests`) 23 | - Pacback Version: (`sudo pacback -v`) 24 | - Install Type: (pacback vs pacback-git) 25 | 26 | 27 | **To Reproduce** 28 | Steps to reproduce the behavior: 29 | 1. Go to '...' 30 | 2. Click on '....' 31 | 3. Scroll down to '....' 32 | 4. See error 33 | 34 | **Expected behavior** 35 | A clear and concise description of what you expected to happen. 36 | 37 | **Screenshots** 38 | If applicable, add screenshots to help explain your problem. 39 | 40 | **Additional context** 41 | Add any other context about the problem here. 
42 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/pacback-feature-request.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Pacback Feature Request 3 | about: Suggest an idea for this project 4 | title: '' 5 | labels: '' 6 | assignees: '' 7 | 8 | --- 9 | 10 | **Is your feature request related to a problem? Please describe.** 11 | A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] 12 | 13 | **Describe the solution you'd like** 14 | A clear and concise description of what you want to happen. 15 | 16 | **Describe alternatives you've considered** 17 | A clear and concise description of any alternative solutions or features you've considered. 18 | 19 | **Additional context** 20 | Add any other context or screenshots about the feature request here. 21 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | core/__pycache__/ 2 | -------------------------------------------------------------------------------- /.gitmodules: -------------------------------------------------------------------------------- 1 | [submodule "core/paf"] 2 | path = core/paf 3 | url = https://github.com/JustinTimperio/paf.git 4 | -------------------------------------------------------------------------------- /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | # Contributor Covenant Code of Conduct 2 | 3 | ## Our Pledge 4 | 5 | In the interest of fostering an open and welcoming environment, we as 6 | contributors and maintainers pledge to making participation in our project and 7 | our community a harassment-free experience for everyone, regardless of age, body 8 | size, disability, ethnicity, sex characteristics, gender identity and expression, 9 | level of experience, 
education, socio-economic status, nationality, personal 10 | appearance, race, religion, or sexual identity and orientation. 11 | 12 | ## Our Standards 13 | 14 | Examples of behavior that contributes to creating a positive environment 15 | include: 16 | 17 | * Using welcoming and inclusive language 18 | * Being respectful of differing viewpoints and experiences 19 | * Gracefully accepting constructive criticism 20 | * Focusing on what is best for the community 21 | * Showing empathy towards other community members 22 | 23 | Examples of unacceptable behavior by participants include: 24 | 25 | * The use of sexualized language or imagery and unwelcome sexual attention or 26 | advances 27 | * Trolling, insulting/derogatory comments, and personal or political attacks 28 | * Public or private harassment 29 | * Publishing others' private information, such as a physical or electronic 30 | address, without explicit permission 31 | * Other conduct which could reasonably be considered inappropriate in a 32 | professional setting 33 | 34 | ## Our Responsibilities 35 | 36 | Project maintainers are responsible for clarifying the standards of acceptable 37 | behavior and are expected to take appropriate and fair corrective action in 38 | response to any instances of unacceptable behavior. 39 | 40 | Project maintainers have the right and responsibility to remove, edit, or 41 | reject comments, commits, code, wiki edits, issues, and other contributions 42 | that are not aligned to this Code of Conduct, or to ban temporarily or 43 | permanently any contributor for other behaviors that they deem inappropriate, 44 | threatening, offensive, or harmful. 45 | 46 | ## Scope 47 | 48 | This Code of Conduct applies both within project spaces and in public spaces 49 | when an individual is representing the project or its community. 
Examples of 50 | representing a project or community include using an official project e-mail 51 | address, posting via an official social media account, or acting as an appointed 52 | representative at an online or offline event. Representation of a project may be 53 | further defined and clarified by project maintainers. 54 | 55 | ## Enforcement 56 | 57 | Instances of abusive, harassing, or otherwise unacceptable behavior may be 58 | reported by contacting the project team at justintimperio@gmail.com. All 59 | complaints will be reviewed and investigated and will result in a response that 60 | is deemed necessary and appropriate to the circumstances. The project team is 61 | obligated to maintain confidentiality with regard to the reporter of an incident. 62 | Further details of specific enforcement policies may be posted separately. 63 | 64 | Project maintainers who do not follow or enforce the Code of Conduct in good 65 | faith may face temporary or permanent repercussions as determined by other 66 | members of the project's leadership. 67 | 68 | ## Attribution 69 | 70 | This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, 71 | available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html 72 | 73 | [homepage]: https://www.contributor-covenant.org 74 | 75 | For answers to common questions about this code of conduct, see 76 | https://www.contributor-covenant.org/faq 77 | -------------------------------------------------------------------------------- /DEVELOPER_GUIDE.md: -------------------------------------------------------------------------------- 1 | # Pacback Developer Guide 2 | 3 | This documentation is for anyone looking to gain a deeper understanding of how pacback works and as a reference guide for those contributing to its development. 
If you are interested in contributing to pacback, [join us on Gitter!](https://gitter.im/pacback/Development) 4 | 5 | **Table of Contents:** 6 | 7 | - [Design Philosophy](https://github.com/JustinTimperio/pacback/blob/master/DEVELOPER_GUIDE.md#design-philosophy) 8 | - [Code Structure Overview](https://github.com/JustinTimperio/pacback/blob/master/DEVELOPER_GUIDE.md#code-structure-overview) 9 | - [Metadata Files](https://github.com/JustinTimperio/pacback/blob/master/DEVELOPER_GUIDE.md#metadata-files) 10 | - [Restore Points](https://github.com/JustinTimperio/pacback/blob/master/DEVELOPER_GUIDE.md#restore-points) 11 | - [Snapshots](https://github.com/JustinTimperio/pacback/blob/master/DEVELOPER_GUIDE.md#snapshots) 12 | - [SubType - Light](https://github.com/JustinTimperio/pacback/blob/master/DEVELOPER_GUIDE.md#subtype---light) 13 | - [SubType - Full](https://github.com/JustinTimperio/pacback/blob/master/DEVELOPER_GUIDE.md#subtype---full) 14 | - [Feature Path](https://github.com/JustinTimperio/pacback/blob/master/DEVELOPER_GUIDE.md#feature-path) 15 | - [Known Bugs and Issues](https://github.com/JustinTimperio/pacback/blob/master/DEVELOPER_GUIDE.md#known-bugs-and-issues) 16 | 17 | ## Design Overview: 18 | 19 | Pacback is written entirely in python3 and attempts to implement most features natively. Since its release, pacback has been aggressively optimized, which has greatly reduced each session's overall runtime. Pacback offers several utilities and automation tools but is primarily designed to use two core restore methods: **Restore Points and Snapshots** 20 | 21 | ## Design Philosophy 22 | 23 | As pacback has grown into a full application, a design philosophy has emerged to meet the demands of the application. The following four pillars inform every part of pacback's design and are ranked in order of importance. Each pillar builds on the previous and MUST NOT compromise the previous. 
24 | 25 | ### Resilience and Safety 26 | 27 | First and foremost, pacback is a tool designed for those encountering errors during an upgrade/change of their system. If pacback is unable to complete a restoration safely for the user, it has ZERO value as an application. Even worse than reaching some internal issue that causes an execution failure is the improper restoration of packages or permanent damage to a system's bootability. Pacback has a zero-tolerance mentality when approaching these types of errors and any **internal** command or function that has even a .01% chance of breaking someone's system. (Currently, pacback cannot catch and correct pacman errors but I hope to add this in the future.) This mentality extends deep into every piece of pacback's code. 28 | 29 | Pacback is designed to be extremely resilient in terms of its runtime and can automatically correct errors that most applications would consider radical edge cases. For example, in the event a user has destroyed their mirrorlist, or it was, by some ludicrously improbable event, destroyed by pacback, a new mirrorlist can automatically be fetched without disrupting the restoration process. Pacback also has correction measures for duplicate cached packages, file corruption, lost files, user input errors, and multiple session overlap, just to name a few. 30 | 31 | ### Efficiency and Performance 32 | 33 | Pacback takes efficiency and performance very seriously since snapshots are created during every pacman transaction. If any part of pacback's execution takes too long, it starts to compromise the ultimate utility of the application. For the most part, pacback has managed to stuff 90% of its total runtime into the space of 50ms-300ms, depending on the complexity of the command and the speed of the OS drive. It is important to distinguish between pacback's internal execution and the programs that it relies on, like pacman. 
The time it takes for pacman to fetch and install packages is not included, as pacback ultimately has no control over that process. 34 | 35 | ### Automate Everything 36 | 37 | From the beginning, pacback was designed as an automation tool. In most cases, if a task can automatically be resolved without a user, pacback attempts to automate it. While this obviously includes the process of upgrading or downgrading packages, it also includes cache cleaning, kernel change detection, and restore point organization. It is important to note that even though any of these processes could require virtually zero user input, pacback tends to ask the user before any change is actually committed to the system unless the `--no_confirm` flag is used. 38 | 39 | ### Functional Modular Code 40 | 41 | Pacback is written exclusively in a functional style, which you can learn more about [here](https://docs.python.org/3/howto/functional.html). This is done for a wide variety of reasons, but most notably, it produces highly modular code that can be easily abstracted across the codebase. Each function is treated as 'input/output' and nearly every function in the codebase expects some input and will return some result to be used by another function. The next section will go into much more detail about how the code is internally structured, but, for example, the file meta.py provides all the functions needed to read and interpret metadata files. If you were, say, building a new feature that required you to read metadata files, you could simply: 42 | 43 | import meta 44 | import session 45 | 46 | config = session.load_config() 47 | meta_dict = meta.read(config, meta_path) 48 | print(meta_dict) 49 | 50 | ## Code Structure Overview 51 | 52 | ### [pacback.py](https://github.com/JustinTimperio/pacback/blob/master/core/pacback.py) 53 | 54 | This file acts as the main entry point for pacback and is softlinked in /usr/bin. This file is responsible for parsing and verifying all user input. 
This does not mean that input won't be rejected down the line, but that it is safe for functions down the line to process. This file is also responsible for starting the active session lock and loading the config file that will then be passed to all lower functions. 55 | 56 | ### [create.py](https://github.com/JustinTimperio/pacback/blob/master/core/create.py) 57 | 58 | As the name implies, this file is responsible for all of pacback's creation functions. Since the addition of snapshots, there was a need for a more 'generic' create command. To cleanly allow for this, `create.restore_point()` and `create.snapshot()` assemble variables and prepare the system for their respective creation processes. Once complete, these functions then pass their 'work' to `create.main()` for the actual assembly onto the file system. When the creation is complete, `create.main()` terminates and returns to the origin function for final completion. 59 | 60 | ### [restore.py](https://github.com/JustinTimperio/pacback/blob/master/core/restore.py) 61 | 62 | As the name implies, this file is responsible for all of pacback's restoration functions. Since pacback offers a diverse set of rollback commands, only `restore.restore_point()` and `restore.snapshot()` access the 'generic' `restore.main()` function. As with create.py, these functions prepare the system for their respective restoration tasks, then hand off to `restore.main()` for the process of restoring the actual packages. This file also contains `restore.packages()`, which allows users to roll back individual packages, and `restore.archive_date()`, which lets the user fall back on a date in the Arch Linux Archive. 63 | 64 | ### [session.py](https://github.com/JustinTimperio/pacback/blob/master/core/session.py) 65 | 66 | As pacback became more complicated, it became necessary to create and manage active session locks to prevent overlaps and collisions between multiple pacback and pacman sessions. 
This file is also responsible for creating and interpreting hook locks, which are used to determine if a snapshot has been created recently. This is especially useful for AUR upgrades, which typically upgrade one package at a time, meaning that a single `yay -Syu` would otherwise produce a snapshot for every package being updated. 67 | 68 | ### [user.py](https://github.com/JustinTimperio/pacback/blob/master/core/user.py) 69 | 70 | This file is meant for any functions built explicitly for user-facing management of pacback files. In most cases, these functions don't impact the file system and simply organize and display information for the user. 71 | 72 | ### [utils.py](https://github.com/JustinTimperio/pacback/blob/master/core/utils.py) 73 | 74 | Utils.py is the backbone of pacback. This file provides much of the core abstraction across pacback and is reserved for functions that are expected to be used by multiple other functions. Most of the code in this file has been highly optimized, as it is referenced multiple times across the codebase. There are far too many functions in this file to cover here but, by far, the most important functions are `utils.scan_caches()` and `utils.search_cache()`. 75 | 76 | ### [meta.py](https://github.com/JustinTimperio/pacback/blob/master/core/meta.py) 77 | 78 | Meta is responsible for the process of reading and interpreting raw metadata files. This is one of the only files that interacts with raw metadata files and interprets them into structured dictionaries for much easier abstraction throughout the codebase. This also extends to the act of comparing metadata files, whose results are also parsed into dictionaries. 79 | 80 | ### [version.py](https://github.com/JustinTimperio/pacback/blob/master/core/version.py) 81 | 82 | This is a small file that is currently only responsible for comparing pacback versions for logging purposes. 
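As a rough illustration of the kind of comparison version.py performs, here is a hedged sketch. The function name and logic below are hypothetical, not pacback's actual code:

```python
def compare_versions(old: str, new: str) -> str:
    """Classify a version change by comparing dotted version strings as tuples."""
    # '2.0.0' -> (2, 0, 0); tuple comparison is element-wise left to right.
    old_t = tuple(int(x) for x in old.split('.'))
    new_t = tuple(int(x) for x in new.split('.'))
    if new_t > old_t:
        return 'upgrade'
    if new_t < old_t:
        return 'downgrade'
    return 'same'

print(compare_versions('1.9.1', '2.0.0'))  # upgrade
```

Comparing integer tuples rather than raw strings avoids the classic pitfall where `'1.10.0' < '1.9.0'` lexicographically.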
83 | 84 | ### [custom_dirs.py](https://github.com/JustinTimperio/pacback/blob/master/core/custom_dirs.py) 85 | 86 | This file is responsible for managing all aspects of storing, comparing, and restoring custom user files. 87 | 88 | ### [PAF](https://github.com/JustinTimperio/paf) 89 | 90 | PAF, or Python-Application-Framework, is my personal framework for building and managing multiple Python projects. This is easily the most referenced entity in the codebase and is responsible for most 'low level' work. Typically, this module attempts to be as pragmatic as possible. While many pip modules exist that may solve some of these issues, PAF has a 'do it yourself' mentality. 91 | 92 | ## Metadata Files 93 | 94 | Metadata files are pacback's primary stored 'data structure' and contain information in a human-readable format about the packages installed at the time of creation, along with other relevant system information. This information is used by pacback to restore package versions and provide general information to the user. Each metadata file will look something like this: 95 | 96 | ======= Pacback Info ======= 97 | Version: 2.0.0 98 | Label: None 99 | Date Created: 2020/08/21 100 | Time Created: 16:21:53 101 | Type: Restore Point 102 | SubType: Full 103 | Packages Installed: 295 104 | Packages Cached: 294 105 | Package Cache Size: 1.52 GB 106 | Dir File Count: 25675 107 | Dir Raw Size: 1.63 GB 108 | Tar Compressed Size: 539.73 MB 109 | Tar Checksum: 18b87bf457f7aedb8a39a8ccf5a9dfc6 110 | 111 | ========= Dir List ========= 112 | /home/conductor/test-core 113 | 114 | ======= Pacman List ======== 115 | libedit 20191231_3.1-1 116 | libmicrohttpd 0.9.71-1 117 | ..... 118 | .... 119 | ... 120 | 121 | ## Restore Points 122 | 123 | ### Light Restore Points 124 | 125 | By default, pacback creates a Light restore point, which consists of only a .meta file. 
When you fall back on this restore point, pacback will search your file system looking for old versions specified in the .meta file. If you have not cleared your cache or are rolling back a recent upgrade, Light restore points provide extremely fast and effective rollbacks. 126 | 127 | **Advantages:** 128 | 129 | - Light RP's are extremely small (~25KB) 130 | - Generating a Light RP is fast (~50-100 milliseconds) 131 | - Low overhead means no impact on pacman upgrade times 132 | 133 | **Disadvantages:** 134 | 135 | - Light RP's will fail to provide value if old package versions are removed every week (e.g. `paccache -r`) 136 | 137 | ### Full Restore Points 138 | 139 | When the full flag is used, pacback searches through your file system looking for each package version installed. Pacback then creates a restore point folder which contains a hardlink to each compiled package ('package.pkg.tar.zst') installed on the system at the time of its creation, along with any additional files the user specifies. 140 | 141 | Full Restore Points also generate a .meta file with additional information needed for the Full restore point. When you fall back on a Full restore point, pacback runs its normal package checks, giving you the ability to roll back packages and remove any new packages added since its creation. Once this is complete, if you have any config files saved, pacback will checksum each file and compare it to your file system. Pacback will then let you selectively overwrite each subsection of file type: Changed, Added, and Removed. 142 | 143 | **Advantages:** 144 | 145 | - Full RP's are 100% self contained 146 | - Adding custom directories allows for the rollback of config files 147 | - Full RP's ensure that packages are not prematurely removed 148 | - Provides Light restore points with additional resilience 149 | 150 | **Disadvantages:** 151 | 152 | - Building full restore points takes slightly longer than light restore points depending on I/O speed. 
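Both restore point types ultimately depend on locating cached package files on disk. A simplified, hypothetical sketch of that search is shown below; pacback's real `utils.scan_caches()` and `utils.search_cache()` are multi-threaded and scan several locations, so treat this only as an illustration of the idea:

```python
import glob
import os
import re


def search_cache(pkg_name, cache_dirs=('/var/cache/pacman/pkg',)):
    """Return paths to every cached version of pkg_name found on disk."""
    found = []
    for cache in cache_dirs:
        for path in glob.glob(os.path.join(cache, '*.pkg.tar.*')):
            # Cached packages are named 'name-version-release-arch.pkg.tar.zst',
            # so require 'name-' followed by a digit. This avoids the generic
            # name collisions mentioned elsewhere in the docs (e.g. searching
            # 'xorg' should not match 'xorg-server-...').
            if re.match(re.escape(pkg_name) + r'-\d', os.path.basename(path)):
                found.append(path)
    return found
```

The digit check after the package name is the simplest way to separate a version field from a longer package name that merely shares a prefix.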
153 | 154 | ## Snapshots 155 | 156 | Snapshots are a new addition and act as an auto-incrementing fallback point. Each snapshot is simply a single metadata file, but snapshots exist separately from restore points. Pacback installs an alpm hook that is triggered each time pacman makes a change to the system. This includes all installs, removals, and upgrades made to the system. As pacback creates new snapshots, each previous snapshot is shifted forward one slot until it reaches the maximum number defined by the user. 157 | 158 | ![Snapshot Architecture](https://i.imgur.com/smxMBK8.jpg) 159 | 160 | ## SubType - Light 161 | 162 | Subtypes are a more generic abstraction that arose out of the need for multiple restoration types. The light subtype implies that only a metadata file has been generated. In the future, if any new additions are added to pacback, they will fall into the subtypes light or full. 163 | 164 | ## SubType - Full 165 | 166 | Subtypes are a more generic abstraction that arose out of the need for multiple restoration types. The full subtype implies that a metadata file has been generated along with a folder that contains all the packages needed for that restoration. Since each package is hardlinked to an inode, a package can be referenced an infinite number of times without duplication. A package will not be fully deleted from the system until all references to the inode are removed. This also provides light restore points with additional resilience, as they will automatically search full restore points for the packages they need. 
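The hardlink behaviour described above can be demonstrated in a few lines. The file names and directories here are hypothetical stand-ins, not pacback's actual restore point layout:

```python
import os
import tempfile

# Two directories standing in for the pacman cache and a restore point folder.
cache = tempfile.mkdtemp()
rp = tempfile.mkdtemp()

pkg = os.path.join(cache, 'zsh-5.8-1-x86_64.pkg.tar.zst')
with open(pkg, 'wb') as f:
    f.write(b'fake package data')

# Hardlink the cached package into the restore point: no data is duplicated,
# both names now point at the same inode.
linked = os.path.join(rp, 'zsh-5.8-1-x86_64.pkg.tar.zst')
os.link(pkg, linked)
print(os.stat(pkg).st_nlink)  # 2 -- two references to one inode

# Removing one reference leaves the data intact; the inode is only freed
# once its link count reaches zero.
os.remove(pkg)
print(os.stat(linked).st_nlink)  # 1
```

Note that hardlinks only work within a single filesystem, which is why the restore point folder must live on the same volume as the package cache.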
167 | 168 | ![Pacback Inodes](https://i.imgur.com/eikZF2g.jpg) 169 | 170 | ## Feature Path 171 | 172 | - [x] Version Checking 173 | - [x] Version Migration 174 | - [x] Improved Cache and Restore Point Cleaning 175 | - [x] Pacman Hook 176 | - [x] Improved Searches for Individual Packages 177 | - [x] Internal Logging 178 | - [x] PEP8 Compliance(ish) 179 | - [x] Multi-Threaded Package Searches and Filtering 180 | - [x] Linux Filesystem Hierarchy Compliance 181 | - [x] Fix Checksumming 182 | - [x] AUR Package(s) 183 | - [x] Improved Internal Documentation 184 | - [x] Session Management 185 | - [x] Add Snapshot Cooldown Lock 186 | - [x] Retain Multiple Snapshots 187 | - [x] Better Color Output 188 | - [x] Add --diff to Compare Two Meta Files 189 | - [x] Improved SigInt (Ctrl-C) Handling 190 | - [ ] Parse/Intercept pacman output and errors 191 | - [ ] Automatic Correction of pacman Errors 192 | - [ ] Automated Code Testing 193 | - [ ] Full CI Pipeline for Releases 194 | - [ ] Support for Fetching Non-Cached Package Versions 195 | - [ ] Orchestrated Upgrades for Production Systems 196 | - [ ] Human Readable Timeline for Snapshots 197 | 198 | ## Known Bugs and Issues 199 | 200 | This list is likely to change as new versions are released. Please read this carefully when updating versions or deploying pacback to new systems. If you run into any errors or are about to submit a bug, please check your log file located in '/var/log/pacback.log'. 201 | 202 | - **Pacback Skips Checksumming Files over 5GB.** - This is done for several reasons. First, Python sucks at reading large files. In my testing, checksumming took 30x-50x longer compared to the terminal equivalent. Second, storing and comparing large files is not really pacback's use-case. Packaging directories into a restore point is intended for saving the state of potentially thousands of small configuration files, not large archives or databases. 
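The size guard described above amounts to something like the following sketch. The 5 GB cutoff and the chunked md5 mirror the behaviour documented here, not necessarily pacback's exact implementation:

```python
import hashlib
import os

SIZE_LIMIT = 5 * 1024 ** 3  # 5 GB


def checksum(path, chunk_size=1024 * 1024):
    """Return the md5 hex digest of a file, or None for files over 5 GB."""
    if os.path.getsize(path) > SIZE_LIMIT:
        # Skipped: hashing huge files in pure Python is far slower than the
        # terminal equivalent, and large archives aren't pacback's use-case.
        return None
    md5 = hashlib.md5()
    with open(path, 'rb') as f:
        # Read in fixed-size chunks so memory use stays flat regardless of
        # file size.
        for chunk in iter(lambda: f.read(chunk_size), b''):
            md5.update(chunk)
    return md5.hexdigest()
```

Reading in chunks rather than with a single `f.read()` keeps memory usage constant even for files just under the limit.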
203 | 204 | - **Fixed:** ~~**Pacback Creates Missing Directories as Root.** - Currently files are copied out of the restore point with the exact same permissions they went in with. The issue here is the creation of missing directories. When Pacback creates these directories the original permissions are not copied.~~ 205 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2020 Justin E Timperio 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 
22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Pacback 2 | ![AUR Main](https://img.shields.io/aur/version/pacback?label=AUR%20Stable&style=for-the-badge) 3 | ![AUR Git](https://img.shields.io/aur/version/pacback-git?label=AUR%20git&style=for-the-badge) 4 | ![Codacy grade](https://img.shields.io/codacy/grade/e7e837f43d794a919070a608642390f4?label=Codacy%20Grade&style=for-the-badge) 5 | ![GitHub](https://img.shields.io/github/license/justintimperio/pacback?style=for-the-badge) 6 | 7 | **Index:** 8 | 9 | 1. [CLI Commands](https://github.com/JustinTimperio/pacback#pacback-cli-commands-and-flags) 10 | 2. [Install Instructions](https://github.com/JustinTimperio/pacback#install-instructions) 11 | 3. [User Guide](https://github.com/JustinTimperio/pacback/blob/master/USER_GUIDE.md) 12 | 4. [Developer Guide](https://github.com/JustinTimperio/pacback/blob/master/DEVELOPER_GUIDE.md) 13 | 14 | ## Abstract: 15 | 16 | Being at the head of Linux kernel and application development means access to the latest features but also often means dealing with the latest bugs. While I don't run into major bugs often, when they happen, they cripple my productivity. Reversing individual packages is generally a slow manual process and while some tools exist, none meet my needs. In particular, support for downgrading AUR packages is extremely lacking. To combat these issues I wrote pacback to automate various downgrade methods for restoring Arch Linux to a previous version state. 
17 | 18 | ## Core Features: 19 | 20 | - Resilient Downgrades and Upgrades 21 | - Rolling System Snapshots 22 | - Rollback to Arch Archive Dates 23 | - Easy Tracking of All System Additions, Removals, and Upgrades 24 | - Native Support for AUR Packages 25 | - Storage and Restoration of Version Dependent Files 26 | - Multi-Threaded Operations 27 | 28 | ## Pacback CLI Commands and Flags: 29 | 30 | Pacback offers several core commands that streamline the process of creating and restoring versions. The CLI is designed to be dead simple and provide detailed feedback and user control. 31 | 32 | ### Core Commands 33 | 34 | - \-c, --create_rp | Generate a pacback restore point. Takes a restore point # as an argument.\ 35 | _Example: `pacback -c 1`_ 36 | - \-rp, --restore_point | Rollback to a previously generated restore point.\ 37 | _Example: `pacback -rp 1`_ 38 | - \-ss, --snapshot | Restore the system to an automatically created snapshot.\ 39 | _Example: `pacback -ss 2`_ 40 | - \-dt, --date | Rollback to a date in the Arch Linux Archive.\ 41 | _Example: `pacback -dt 2019/08/14`_ 42 | - \-pkg, --package | Rollback a list of packages looking for old versions on the system.\ 43 | _Example: `pacback -pkg zsh cpupower neovim`_ 44 | 45 | ### Flags 46 | 47 | - \-f, --full_rp | Generate a pacback full restore point.\ 48 | _Example: `pacback -f -c 1`_ 49 | - \-d, --add_dir | Add any custom directories to your restore point during a `--create_rp AND --full_rp`.\ 50 | _Example: `pacback -f -c 1 -d /dir1/to/add /dir2/to/add /dir3/to/add`_ 51 | - \-nc, --no_confirm | Skip asking user questions during RP creation. 
Will answer yes to most input.\ 52 | _Example: `pacback -nc -c 1`_ 53 | - \-l, --label | Add a label to your restore point.\ 54 | _Example: `pacback -nc -c 1 -f -l 'Production'`_ 55 | 56 | ### Print Info 57 | 58 | - \-ls, --list | List information about all restore points and snapshots.\ 59 | _Example: `pacback -ls`_ 60 | - \-i, --info | Print information about a restore point or snapshot.\ 61 | _Example: `pacback -i rp1` or `pacback -i ss1`_ 62 | - \-df, --diff | Compare any two restore points or snapshots.\ 63 | _Example: `pacback -df rp1 rp2` or `pacback -df rp1 ss1`_ 64 | - \-v, --version | Display pacback version and cache information.\ 65 | _Example: `pacback -v`_ 66 | 67 | ### Utilities 68 | 69 | - \-cache, --cache_size | Calculate reported and actual cache sizes.\ 70 | _Example: `pacback -cache`_ 71 | - \-cl, --clean | Clean old and orphaned packages along with old restore points.\ 72 | _Example: `pacback -cl`_ 73 | - \-rm, --remove | Removes the selected restore point.\ 74 | _Example: `pacback -rm 12 -nc`_ 75 | - \--install_hook | Install a pacman hook that creates a snapshot during each pacman transaction.\ 76 | _Example: `pacback --install_hook`_ 77 | - \--remove_hook | Removes the pacman hook that creates snapshots.\ 78 | _Example: `pacback --remove_hook`_ 79 | 80 | ## Install Instructions: 81 | 82 | Pacback offers two AUR packages. (Special thanks to [Attila Greguss](https://github.com/Gr3q) for maintaining them.) 83 | 84 | [pacback](https://aur.archlinux.org/packages/pacback): This is the recommended install for most users. Releases mark stable points in Pacback's development, preventing unnecessary upgrades/changes that may introduce instability into production machines. 85 | 86 | [pacback-git](https://aur.archlinux.org/packages/pacback-git): This package fetches the latest version from git. 
The master branch will periodically be unstable but is ideal for anyone looking to contribute to pacback's development or for anyone who wants access to the latest features and patches. 87 | 88 | ## User Guide 89 | 90 | While there are only a few CLI commands, they can be used in a wide variety of complex restoration tasks. The user guide has grown considerably and has been moved to its own page! [Check it out here!](https://github.com/JustinTimperio/pacback/blob/master/USER_GUIDE.md) 91 | 92 | ## Developer Guide 93 | 94 | Interested in helping develop pacback? Have questions about how it works? The detailed developer guide explains all the core features, codebase, and design philosophy of pacback. [Check it out here!](https://github.com/JustinTimperio/pacback/blob/master/DEVELOPER_GUIDE.md) 95 | -------------------------------------------------------------------------------- /USER_GUIDE.md: -------------------------------------------------------------------------------- 1 | # Pacback User Guide 2 | 3 | **Table of Contents:** 4 | 5 | - [Get Snapshot and Restore Point Info](https://github.com/JustinTimperio/pacback/blob/master/USER_GUIDE.md#getting-snapshot-and-restore-point-info) 6 | - [Using Rolling System Snapshots](https://github.com/JustinTimperio/pacback/blob/master/USER_GUIDE.md#rolling-system-snapshots) 7 | - [Creating Permanent Restore Points](https://github.com/JustinTimperio/pacback/blob/master/USER_GUIDE.md#creating-permanent-restore-points) 8 | - [Rollback a List of Packages](https://github.com/JustinTimperio/pacback/blob/master/USER_GUIDE.md#rollback-a-list-of-packages) 9 | - [Rolling Back to an Archive Date](https://github.com/JustinTimperio/pacback/blob/master/USER_GUIDE.md#rolling-back-to-an-archive-date) 10 | - [Backup Version Sensitive Application Data](https://github.com/JustinTimperio/pacback/blob/master/USER_GUIDE.md#backup-version-sensitive-application-data) 11 | - [Automated Cache 
Cleaning](https://github.com/JustinTimperio/pacback/blob/master/USER_GUIDE.md#automated-cache-cleaning) 12 | - [Using the User Config File](https://github.com/JustinTimperio/pacback/blob/master/USER_GUIDE.md#using-the-user-config-file) 13 | 14 | ## Getting Snapshot and Restore Point Info 15 | 16 | As you accumulate snapshots (SS) and restore points (RP), it can be easy to lose track of the changes you’ve made to the system. Pacback makes it easy not only to get information about an individual RP or SS, but also to compare any of your RPs or SSs. 17 | 18 | Let's say you want to get information about a restore point you created a few weeks ago. All you need to do is run `pacback -i rp30`: 19 | 20 | ![Get Info](https://i.imgur.com/oZexzd3.png) 21 | 22 | You can also compare any two restore points or snapshots easily with the `--diff` command. Let's say you want to compare a restore point you created a few days ago against a snapshot that was generated just a few hours ago. All you need to do is run `pacback -df rp11 ss5`: 23 | 24 | ![Diff Command](https://i.imgur.com/ghZoi95.png) 25 | 26 | ## Rolling System Snapshots 27 | 28 | One of the problems with rolling releases is that you never know when a problem might occur. It may be months before you run into an issue, at which point you will need to scramble to figure out when your system was last stable. By using the integrated pacman hook, Pacback creates a snapshot every time you make any change to the system. This means you can revert your system to any point in time without having created a restore point ahead of time. This also gives a high degree of granularity when making many small changes throughout the day. 29 | 30 | 1. Make a change to your system: `pacman -S tree rsync htop` 31 | 2. Run `pacback -ls` and you should see a new snapshot in slot `#00`. Each time you make a change (add, remove, or upgrade any package) a snapshot will be created during the transaction. 32 | 3.
To undo all the changes you just made, simply run `pacback -ss 0`. 33 | 34 | ![Pacback Snapshot](https://i.imgur.com/GE61yqe.gif) 35 | 36 | ## Creating Permanent Restore Points 37 | 38 | Remember that one time all your packages were working perfectly? (God that was great.) Have a production system running perfectly that needs to be updated, but you don't want to back up the whole disk? With pacback restore points, you don't have to lose that version state. Restore points are user-defined version states that describe a set of packages, and even configuration files, on your system that you don't want to lose track of. Unlike many backup utilities, pacback doesn't need to back up the whole disk. Instead, it hardlinks packages to their existing inodes without duplicating the files, so that pacback maintains the smallest possible profile on your system. 39 | 40 | To create a restore point, then get information about it: 41 | 42 | 1. `pacback -nc -f -c 1 -l 'Production'` 43 | 2. `pacback -i rp1` 44 | 45 | ![Restore Points](https://i.imgur.com/5f5d5HI.gif) 46 | 47 | ## Rollback a List of Packages 48 | 49 | Most issues introduced by an upgrade stem from a single package or a set of related packages. Pacback allows you to selectively roll back a list of packages using `pacback -pkg`. Pacback searches your file system looking for all versions associated with each package name. When searching, Pacback attempts to avoid matching generic names used by multiple packages (e.g. _xorg_ in _xorg_-server, _xorg_-docs, _xorg_-xauth). If no packages are found, the search parameters can be widened, but this will likely return inaccurate results. 50 | 51 | In this example, we selectively roll back 2 packages. 52 | 53 | 1. `pacback -pkg typescript neofetch` 54 | 55 | ![Pacback Rolling Back a List of Packages](https://i.imgur.com/9bd4YRB.gif) 56 | 57 | ## Rolling Back to an Archive Date 58 | 59 | Another popular way to roll back is to fetch packages directly from the Arch Linux Archives using pacman.
Pacback automates this entire process with the `pacback -dt` command. To roll back to a specific date, give `-dt` a date in YYYY/MM/DD format and Pacback will automatically save your mirrorlist, point a new mirrorlist to an archive URL, then run a full system downgrade. 60 | 61 | 1. `pacback -dt 2019/10/18` 62 | 63 | ![Pacback Rolling Back an Archive Date](https://i.imgur.com/jhkeoCF.gif) 64 | 65 | ## Backup Version Sensitive Application Data 66 | 67 | In some cases, config files may need to be modified when updating packages. In other cases, you may want to back up application data before deploying an upgrade in case of error or corruption. Pacback makes it extremely simple to store these files and will automatically compare the files you have stored against your current file system. Once checksumming is complete, you can selectively overwrite each subsection of file type: Changed, Added, and Removed. 68 | 69 | In this example, we pack up a custom directory containing over 10k files. 70 | 71 | 1. `pacback -c 1 -f -nc -d /home/conductor/test-core` 72 | 2. `pacback -rp 1` 73 | 74 | ![Pacback Saving App Data](https://i.imgur.com/Us8LqGj.gif) 75 | 76 | ## Automated Cache Cleaning 77 | 78 | Pacback is first and foremost an automation tool, and this extends to system maintenance. Cleaning and managing the cached packages on a system is something that many of us forget to do. Pacback has an inbuilt function that lets you easily clean old and orphaned packages. It will also check each one of your restore points, looking for old ones that can be purged from the system. 79 | 80 | ![Pacback Cache Cleaning](https://i.imgur.com/UeL2H9B.gif) 81 | 82 | ## Using The User Config File 83 | 84 | Pacback comes pre-configured out of the box, but many users may need to customize pacback for their particular use case. Pacback creates the file `/etc/pacback.conf`, which users can modify to meet their needs.
This includes modifying some of pacback's lower-level features, like snapshot locks, and also allows the user to outright disable some features. 85 | 86 | Below is the preconfigured config file: 87 | 88 | ``` 89 | ## Pacback Config File 90 | ## Version 2.0.0 91 | 92 | ## Mandatory Settings 93 | 94 | # Number Of Seconds Before The Snapshot Lock Expires 95 | # MUST be an INT 96 | hook_cooldown = 300 97 | 98 | # Max Number Of Snapshots To Keep 99 | # MUST be an INT 100 | max_ss = 25 101 | 102 | # Let The User Schedule a Reboot if Needed. 103 | # If False Pacback Will Only Notify You 104 | # MUST be True or False 105 | reboot = True 106 | 107 | ## Optional Settings 108 | 109 | # Number Of Minutes In Future To Schedule Reboot 110 | # Only Runs After The Kernel Has Changed 111 | # MUST be an INT 112 | reboot_offset = 5 113 | 114 | # Number of Old Cached Versions To Keep 115 | # When Running a Pacback Cache Clean 116 | # MUST be an INT 117 | keep_versions = 3 118 | 119 | # Number Of Days Before an RP is Flagged Old 120 | # MUST be an INT 121 | old_rp = 180 122 | 123 | ``` 124 | -------------------------------------------------------------------------------- /build/BUILDPKG_FULL_TEMPLATE: -------------------------------------------------------------------------------- 1 | # Maintainer: Attila Greguss 2 | # Co-Maintainer/Author: Justin Timperio 3 | 4 | pkgname=pacback 5 | pkgver=_VERSION_ 6 | pkgrel=0 7 | pkgdesc='Advanced Version Control for Arch Linux' 8 | arch=('x86_64') 9 | url='https://github.com/JustinTimperio/pacback' 10 | license=('MIT') 11 | provides=('pacback') 12 | conflicts=('pacback-git') 13 | # avoid overwriting modified config files 14 | install='pacback.install' 15 | backup=('etc/pacback.conf') 16 | depends=('python' 'python-rich' 'python-requests' 'pacman-contrib') 17 | makedepends=('zstd') 18 | optdepends=('pigz: Multithreaded de/compression of custom user files') 19 |
source=('https://github.com/JustinTimperio/pacback/releases/download/v_VERSION_/_PACKAGE_') 20 | sha512sums=('_PKG_CHECKSUM_') 21 | 22 | 23 | package() { 24 | cd "${srcdir}" 25 | install -dm 755 "${pkgdir}"{/usr/{share/{pacback,pacback/paf,licences/pacback},bin/},/etc} 26 | install -dm 1777 "${pkgdir}"/tmp 27 | cp -dr --no-preserve='ownership' core "${pkgdir}"/usr/share/pacback 28 | cp -dr --no-preserve='ownership' LICENSE "${pkgdir}"/usr/share/licences/pacback 29 | cp -dr --no-preserve='ownership' config "${pkgdir}"/etc/pacback.conf 30 | ln -sf /usr/share/pacback/core/pacback.py "${pkgdir}"/usr/bin/pacback 31 | } 32 | -------------------------------------------------------------------------------- /build/BUILDPKG_GIT_TEMPLATE: -------------------------------------------------------------------------------- 1 | # Maintainer: Attila Greguss 2 | # Co-Maintainer/Author: Justin Timperio 3 | 4 | pkgname=pacback-git 5 | pkgver=_VERSION_ 6 | pkgrel=0 7 | pkgdesc='Advanced Version Control for Arch Linux' 8 | arch=('x86_64') 9 | url='https://github.com/JustinTimperio/pacback' 10 | license=('MIT') 11 | provides=('pacback') 12 | conflicts=('pacback') 13 | # avoid overwriting modified config files 14 | install='pacback.install' 15 | backup=('etc/pacback.conf') 16 | depends=('python' 'python-rich' 'python-requests' 'pacman-contrib') 17 | makedepends=('zstd') 18 | optdepends=('pigz: Multithreaded de/compression of custom user files') 19 | source=('git+https://github.com/JustinTimperio/pacback.git') 20 | sha256sums=('SKIP') 21 | 22 | 23 | pkgver() { 24 | cd "${srcdir}/pacback" 25 | # Get the version number. 
26 | printf "r%s.%s" "$(git rev-list --count HEAD)" "$(git rev-parse --short HEAD)" 27 | } 28 | 29 | prepare() { 30 | cd "${srcdir}/pacback" 31 | git submodule init core/paf 32 | git config submodule.paf.url "$srcdir/paf" 33 | git submodule update 34 | } 35 | 36 | package() { 37 | cd "${srcdir}" 38 | install -dm 755 "${pkgdir}"{/usr/{share/{pacback,pacback/paf,licences/pacback},bin/},/etc} 39 | install -dm 1777 "${pkgdir}"/tmp 40 | cp -dr --no-preserve='ownership' pacback/core "${pkgdir}"/usr/share/pacback 41 | cp -dr --no-preserve='ownership' pacback/LICENSE "${pkgdir}"/usr/share/licences/pacback 42 | cp -dr --no-preserve='ownership' pacback/build/config "${pkgdir}"/etc/pacback.conf 43 | ln -sf /usr/share/pacback/core/pacback.py "${pkgdir}"/usr/bin/pacback 44 | } 45 | -------------------------------------------------------------------------------- /build/INSTALL_TEMPLATE: -------------------------------------------------------------------------------- 1 | # Maintainer: Attila Greguss 2 | # Co-Maintainer/Author: Justin Timperio 3 | 4 | post_install() { 5 | # Installs Snapshot Hook 6 | pacback -ih 7 | 8 | # Make Base Dirs 9 | mkdir -p /var/lib/pacback 10 | mkdir -p /var/lib/pacback/restore-points 11 | mkdir -p /var/lib/pacback/snapshots 12 | } 13 | 14 | post_upgrade() { 15 | ## Patch pacman Hook Location 16 | if [ -f "/etc/pacman.d/hooks/pacback.hook" ]; then 17 | mv /etc/pacman.d/hooks/pacback.hook /usr/share/libalpm/hooks/pacback.hook 18 | fi 19 | 20 | ## Fix For Broken Package Versions 21 | mkdir -p /var/lib/pacback 22 | mkdir -p /var/lib/pacback/restore-points 23 | mkdir -p /var/lib/pacback/snapshots 24 | 25 | ## Run Alpha Upgrade 26 | # Fix First Line 27 | find /var/lib/pacback -type f -name '*.meta' -exec sed -i '1 s/^====== Pacback RP.*/======= Pacback Info =======/' {} + 28 | 29 | # Fix Version Field 30 | find /var/lib/pacback -type f -name '*.meta' -exec sed -i 's/^Pacback Version:/Version:/' {} + 31 | 32 | # Add Type Fields 33 | find /var/lib/pacback -type
f -name '*.meta' -exec sed -i '/^Time Created:.*/a Type: Restore Point' {} + 34 | find /var/lib/pacback -type f -name '*.meta' -exec sed -i 's/^Packages in RP: 0/SubType: Light/' {} + 35 | 36 | # Remove Fields If No Packages Are Cached 37 | find /var/lib/pacback -type f -name '*.meta' -exec sed -i '/^Size of Packages in RP: 0B/d' {} + 38 | 39 | # Add SubType Field 40 | find /var/lib/pacback -type f -name '*.meta' -exec sed -i '/^Packages in RP:.*/i SubType: Full' {} + 41 | 42 | # Fix Fields If Package Are Cached 43 | find /var/lib/pacback -type f -name '*.meta' -exec sed -i 's/^Packages in RP:/Packages Cached:/' {} + 44 | find /var/lib/pacback -type f -name '*.meta' -exec sed -i 's/^Size of Packages in RP:/Package Cache Size:/' {} + 45 | 46 | # Fix Custom Dir Fields 47 | find /var/lib/pacback -type f -name '*.meta' -exec sed -i 's/^Dirs /Dir /' {} + 48 | 49 | # Change Package Cache Folder Name 50 | find /var/lib/pacback/restore-points -type d -name 'pac_cache' -exec rename 'pac_cache' 'pkg-cache' {} + 51 | } 52 | 53 | pre_remove() { 54 | # Removes Snapshot Hook 55 | pacback -rh 56 | } 57 | 58 | post_remove() { 59 | # Removes Cached User Data 60 | rm -Rf /var/lib/pacback 61 | } 62 | -------------------------------------------------------------------------------- /build/config: -------------------------------------------------------------------------------- 1 | ## Pacback Config File 2 | ## Version 2.0.0 3 | 4 | ## Mandatory Settings 5 | 6 | # Number Of Seconds Before The Snapshot Lock Expires 7 | # MUST be an INT 8 | hook_cooldown = 300 9 | 10 | # Max Number Of Snapshots To Keep 11 | # MUST be an INT 12 | max_ss = 25 13 | 14 | # Let The User Schedule a Reboot if Needed. 
15 | # If False Pacback Will Only Notify You 16 | # MUST be True or False 17 | reboot = True 18 | 19 | ## Optional Settings 20 | 21 | # Number Of Minutes In Future To Schedule Reboot 22 | # Only Runs After The Kernel Has Changed 23 | # MUST be an INT 24 | reboot_offset = 5 25 | 26 | # Number of Old Cached Versions To Keep 27 | # When Running a Pacback Cache Clean 28 | # MUST be an INT 29 | keep_versions = 3 30 | 31 | # Number Of Days Before an RP is Flagged Old 32 | # MUST be an INT 33 | old_rp = 180 34 | -------------------------------------------------------------------------------- /build/make-full-release.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | ################ 4 | # Define Vars 5 | ############## 6 | 7 | script_path=$(realpath $0) 8 | build_path=$(dirname $script_path) 9 | repo_path=$(dirname $build_path) 10 | pkg_version=$(grep -Poi '\d+\.\d+\.\d+' $repo_path/core/session.py) 11 | tar_name=$(echo pacback-$pkg_version-SOURCE.tar) 12 | zst_name=$(echo pacback-$pkg_version-SOURCE.tar.zst) 13 | base_path='/tmp/pacback' 14 | tar_path=$base_path/$tar_name 15 | zst_path=$base_path/$tar_name.zst 16 | buildpkg_path=$base_path/PKGBUILD 17 | srcinfo_path=$base_path/.SRCINFO 18 | pacback_install_path=$base_path/pacback.install 19 | 20 | 21 | #################### 22 | # Setup For Build 23 | ################## 24 | 25 | rm -fR $base_path 26 | mkdir $base_path 27 | cd $repo_path 28 | 29 | 30 | ########################## 31 | # Build Release Package 32 | ######################## 33 | 34 | # Add Config to Tar 35 | cd build 36 | find . -maxdepth 1 -type f -name 'config' -exec tar -rvf $tar_path --owner=0 --group=0 {} + 37 | 38 | # Add License to Tar 39 | cd .. 40 | find . -maxdepth 1 -type f -name 'LICENSE' -exec tar -rvf $tar_path --owner=0 --group=0 {} + 41 | 42 | # Add Core Files to Tar 43 | cd core 44 | find . 
-maxdepth 1 -type f -name '*.py' -exec tar -rvf $tar_path --owner=0 --group=0 --transform 's,.,core,' {} + 45 | 46 | # Add PAF to Tar 47 | cd paf 48 | find . -maxdepth 1 -type f -name '*.py' -exec tar -rvf $tar_path --owner=0 --group=0 --transform 's,.,core/paf,' {} + 49 | 50 | # Compress Source Package 51 | zstd -z $tar_path 52 | pkg_csum=$(sha512sum $zst_path | cut -d " " -f 1) 53 | rm $tar_path 54 | 55 | 56 | ######################### 57 | # Create Build Package 58 | ####################### 59 | 60 | # Make BUILDPKG 61 | cp $build_path/BUILDPKG_FULL_TEMPLATE $buildpkg_path 62 | cp $build_path/INSTALL_TEMPLATE $pacback_install_path 63 | sed -i "s/_VERSION_/$pkg_version/" $buildpkg_path 64 | sed -i "s/_PACKAGE_/$zst_name/" $buildpkg_path 65 | sed -i "s/_PKG_CHECKSUM_/$pkg_csum/" $buildpkg_path 66 | 67 | # Make SRCINFO 68 | cd $base_path 69 | makepkg --printsrcinfo > $srcinfo_path 70 | makepkg 71 | -------------------------------------------------------------------------------- /build/make-git-release.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | ################ 4 | # Define Vars 5 | ############## 6 | 7 | pkg_version=$(printf "r%s.%s" "$(git rev-list --count HEAD)" "$(git rev-parse --short HEAD)") 8 | base_path='/tmp/pacback-git' 9 | buildpkg_path=$base_path/PKGBUILD 10 | srcinfo_path=$base_path/.SRCINFO 11 | pacback_install_path=$base_path/pacback.install 12 | 13 | 14 | #################### 15 | # Setup For Build 16 | ################## 17 | 18 | script_path=$(realpath $0) 19 | dir_path=$(dirname $script_path) 20 | rm -fR $base_path 21 | mkdir $base_path 22 | cd $dir_path 23 | 24 | 25 | ######################### 26 | # Create Build Package 27 | ####################### 28 | 29 | # Make BUILDPKG 30 | cp $dir_path/BUILDPKG_GIT_TEMPLATE $buildpkg_path 31 | cp $dir_path/INSTALL_TEMPLATE $pacback_install_path 32 | sed -i "s/_VERSION_/$pkg_version/" $buildpkg_path 33 | 34 | # Make SRCINFO 35 | cd 
$base_path 36 | makepkg --printsrcinfo > $srcinfo_path 37 | makepkg 38 | -------------------------------------------------------------------------------- /core/create.py: -------------------------------------------------------------------------------- 1 | #! /usr/bin/env python3 2 | import os 3 | import datetime as dt 4 | 5 | # Local Modules 6 | import paf 7 | import utils 8 | import session 9 | import custom_dirs 10 | 11 | 12 | ######################### 13 | # Main Create Function 14 | ####################### 15 | 16 | def main(config, info): 17 | ''' 18 | This is pacback's main method for orchestrating the creation of a 19 | fallback point. It shouldn't be called directly with create.main() 20 | but rather by a 'higher' level call that stages the system for the 21 | actual creation process. 22 | ''' 23 | fname = 'create.main(' + info['type'] + info['id'] + ')' 24 | paf.write_to_log(fname, 'Building ID:' + info['id'] + ' As ' + info['STYPE'] + ' ' + info['TYPE'], config['log']) 25 | 26 | # Light Restore Point 27 | if info['STYPE'] == 'Light': 28 | if info['dir_list']: 29 | session.abort_fail(fname, 'Custom Dirs Are Not Allowed With STYPE: ' + info['STYPE'], 30 | 'Light ' + info['TYPE'] + ' DO NOT Support Custom Dirs!
Please Use The `-f` Flag', config) 31 | # Full Restore Point 32 | elif info['STYPE'] == 'Full': 33 | pkg_search = paf.replace_spaces(utils.pacman_Q(), '-') 34 | found_pkgs = utils.search_cache(pkg_search, utils.scan_caches(config), config) 35 | pkg_size = paf.size_of_files(found_pkgs) 36 | 37 | # Ask About Missing Pkgs 38 | if len(found_pkgs) != len(pkg_search): 39 | paf.write_to_log(fname, 'Not All Packages Were Found!', config['log']) 40 | pkg_split = utils.trim_pkg_list(found_pkgs) 41 | print('') 42 | paf.prBold('======================================') 43 | paf.prWarning('The Following Packages Were NOT Found!') 44 | paf.prBold('======================================') 45 | for pkg in set(pkg_search - pkg_split): 46 | paf.prWarning(pkg) 47 | print('') 48 | 49 | if info['nc'] is False: 50 | if paf.yn_frame('Do You Still Want to Continue?') in (False, None): 51 | session.abort(fname, 'User Aborted Due to Missing Pkgs', 52 | 'Aborting Creation!', config) 53 | 54 | # Make Folders and Hardlink Packages 55 | paf.mk_dir(info['path'], sudo=False) 56 | paf.mk_dir(info['pkgcache'], sudo=False) 57 | 58 | for pkg in found_pkgs: 59 | os.link(pkg, info['pkgcache'] + '/' + paf.basename(pkg)) 60 | paf.write_to_log(fname, 'HardLinked ' + str(len(found_pkgs)) + ' Packages', config['log']) 61 | 62 | # Search Custom Dirs 63 | if info['dir_list']: 64 | paf.write_to_log(fname, 'User Selected Version Dependent Folders For Storage', config['log']) 65 | pack_results = custom_dirs.store(config, info) 66 | 67 | # Generate Meta Data File 68 | current_pkgs = utils.pacman_Q() 69 | meta = [ 70 | '======= Pacback Info =======', 71 | 'Version: ' + config['version'], 72 | 'Label: ' + info['label'], 73 | 'Date Created: ' + dt.datetime.now().strftime("%Y/%m/%d"), 74 | 'Time Created: ' + dt.datetime.now().strftime("%H:%M:%S"), 75 | 'Type: ' + info['TYPE'], 76 | 'SubType: ' + info['STYPE'], 77 | 'Packages Installed: ' + str(len(current_pkgs)) 78 | ] 79 | 80 | if info['STYPE'] == 'Full':
meta.append('Packages Cached: ' + str(len(found_pkgs))) 82 | meta.append('Package Cache Size: ' + paf.convert_size(pkg_size)) 83 | 84 | if info['dir_list']: 85 | meta.append('Dir File Count: ' + str(pack_results['file_count'])) 86 | meta.append('Dir Raw Size: ' + pack_results['raw_size']) 87 | meta.append('Tar Compressed Size: ' + pack_results['compressed_size']) 88 | meta.append('Tar Checksum: ' + pack_results['csum']) 89 | 90 | meta.append('') 91 | meta.append('========= Dir List =========') 92 | for d in info['dir_list']: 93 | meta.append(d) 94 | 95 | meta.append('') 96 | meta.append('======= Pacman List ========') 97 | for pkg in current_pkgs: 98 | meta.append(pkg) 99 | 100 | # Export Final Meta Data File 101 | paf.export_iterable(info['meta'], meta) 102 | paf.write_to_log(fname, 'Generated Meta Data File', config['log']) 103 | # Checksum Meta Data File 104 | paf.export_iterable(info['meta_md5'], [paf.checksum_file(info['meta'])[1]]) 105 | paf.write_to_log(fname, 'Generated Meta Data Checksum', config['log']) 106 | # Finish and Return 107 | paf.write_to_log(fname, 'Main Build Complete of ID:' + info['id'] + ' As ' + info['STYPE'] + ' ' + info['TYPE'], config['log']) 108 | 109 | 110 | #################### 111 | # Create Snapshot 112 | ################## 113 | 114 | def snapshot(config, label): 115 | ''' 116 | Assembles all the info for main() and stages the file system for the creation 117 | of a new snapshot with id='00'. This is only called by `--hook`. 
118 | ''' 119 | num = '00' 120 | fname = 'create.snapshot(' + num + ')' 121 | paf.write_to_log(fname, 'Started Snapshot Creation...', config['log']) 122 | session.hlock_check(config) 123 | 124 | info = { 125 | 'id': num, 126 | 'type': 'ss', 127 | 'TYPE': 'Snapshot', 128 | 'stype': 'l', 129 | 'STYPE': 'Light', 130 | 'nc': True, 131 | 'label': str(label), 132 | 'meta': config['ss_paths'] + '/ss' + num + '.meta', 133 | 'meta_md5': config['ss_paths'] + '/.ss' + num + '.md5', 134 | 'dir_list': [], 135 | 'path': config['ss_paths'] + '/ss' + num, 136 | 'pkgcache': config['ss_paths'] + '/ss' + num + '/pkg-cache', 137 | 'tar': config['ss_paths'] + '/ss' + num + '/ss' + num + '_dirs.tar' 138 | } 139 | 140 | # Shift Snapshots Forward So This Becomes Zero 141 | if os.path.exists(config['ss_paths'] + '/ss00.meta'): 142 | paf.write_to_log(fname, 'Shifting All Snapshots Forward +1...', config['log']) 143 | 144 | # Remove the Last Snapshot 145 | paf.rm_file(config['ss_paths'] + '/ss' + str(config['max_ss']).zfill(2) + '.meta', sudo=False) 146 | paf.rm_file(config['ss_paths'] + '/.ss' + str(config['max_ss']).zfill(2) + '.md5', sudo=False) 147 | 148 | # Moves Each Snapshot Forward +1 and Cleans on Exceptions 149 | for n in range((config['max_ss'] - 1), -1, -1): 150 | meta_path_old = config['ss_paths'] + '/ss' + str(n).zfill(2) + '.meta' 151 | meta_path_new = config['ss_paths'] + '/ss' + str(n + 1).zfill(2) + '.meta' 152 | hash_path_old = config['ss_paths'] + '/.ss' + str(n).zfill(2) + '.md5' 153 | hash_path_new = config['ss_paths'] + '/.ss' + str(n + 1).zfill(2) + '.md5' 154 | meta_found = os.path.exists(meta_path_old) 155 | csum_found = os.path.exists(hash_path_old) 156 | 157 | if meta_found and csum_found: 158 | os.rename(meta_path_old, meta_path_new) 159 | os.rename(hash_path_old, hash_path_new) 160 | 161 | elif meta_found and not csum_found: 162 | paf.write_to_log(fname, 'Snapshot ' + str(n).zfill(2) + ' is Missing its Checksum File!', config['log']) 163 |
paf.rm_file(meta_path_old, sudo=False) 164 | paf.write_to_log(fname, 'Removed Snapshot ID:' + str(n).zfill(2), config['log']) 165 | 166 | elif not meta_found and csum_found: 167 | paf.write_to_log(fname, hash_path_old + ' is an Orphaned Checksum', config['log']) 168 | paf.rm_file(hash_path_old, sudo=False) 169 | paf.write_to_log(fname, 'Removed Orphaned Checksum', config['log']) 170 | 171 | else: 172 | pass 173 | 174 | paf.write_to_log(fname, 'Finished Shifting Snapshots Forward', config['log']) 175 | 176 | else: 177 | paf.write_to_log(fname, 'Snapshot ID:00 Was Not Found, Shift Forward is Unnecessary.', config['log']) 178 | 179 | # Creates Snapshot After Pre-Transaction Work and Checks 180 | paf.write_to_log(fname, 'All Checks Passed! Ready For Snapshot Creation', config['log']) 181 | paf.prBold('Creating Snapshot...') 182 | main(config, info) 183 | 184 | # Prevents Back-to-Back Snapshots(Especially During AUR Upgrades) 185 | session.hlock_start(config) 186 | paf.write_to_log(fname, 'Snapshot Creation Complete!', config['log']) 187 | paf.prBold('Snapshot Creation Complete!') 188 | 189 | 190 | ######################### 191 | # Create Restore Point 192 | ####################### 193 | 194 | def restore_point(config, num, full_rp, dir_list, no_confirm, label): 195 | ''' 196 | Assembles all the info for main() and stages the file system 197 | for the creation of a restore point. It is assumed that user input 198 | has been cleansed by this point. 
199 | ''' 200 | num = str(num).zfill(2) 201 | fname = 'create.restore_point(' + num + ')' 202 | paf.write_to_log(fname, 'Started Restore Point Creation...', config['log']) 203 | 204 | info = { 205 | 'id': num, 206 | 'type': 'rp', 207 | 'TYPE': 'Restore Point', 208 | 'stype': 'f' if full_rp is True else 'l', 209 | 'STYPE': 'Full' if full_rp is True else 'Light', 210 | 'nc': no_confirm, 211 | 'label': str(label), 212 | 'meta': config['rp_paths'] + '/rp' + num + '.meta', 213 | 'meta_md5': config['rp_paths'] + '/.rp' + num + '.md5', 214 | 'dir_list': dir_list, 215 | 'path': config['rp_paths'] + '/rp' + num, 216 | 'pkgcache': config['rp_paths'] + '/rp' + num + '/pkg-cache', 217 | 'tar': config['rp_paths'] + '/rp' + num + '/rp' + num + '_dirs.tar' 218 | } 219 | 220 | # Check for Pre-Existing Restore Point 221 | if os.path.exists(info['meta']) or os.path.exists(info['path']): 222 | paf.prWarning('Restore Point #' + info['id'] + ' Already Exists!') 223 | 224 | if info['nc'] is False: 225 | if paf.yn_frame('Do You Want to Overwrite It?') in (False, None): 226 | session.abort(fname, 'User Aborted Overwrite of RP #' + info['id'], 'Aborting Creation!', config) 227 | utils.remove_id(config, info) 228 | 229 | # Create Restore Point After Checks 230 | paf.write_to_log(fname, 'All Checks Passed! Handing Off to create.main()', config['log']) 231 | paf.prBold('Building ' + info['STYPE'] + ' ' + info['TYPE'] + ' ' + info['id'] + '...') 232 | main(config, info) 233 | 234 | # Finish After Successful Creation 235 | paf.write_to_log(fname, 'Restore Point Creation Complete!', config['log']) 236 | paf.prBold('Restore Point Creation Complete!') 237 | -------------------------------------------------------------------------------- /core/custom_dirs.py: -------------------------------------------------------------------------------- 1 | #!
/usr/bin/env python3 2 | # I want to say this is my least favorite part of the whole codebase 3 | # I almost regret adding this feature but it's here now 4 | # The code is not documented to the same standard because I hate 5 | # working on this part of the codebase. 6 | 7 | import re 8 | import os 9 | import shutil 10 | import pickle 11 | import tarfile 12 | import tempfile 13 | from rich.progress import track 14 | 15 | # Local Modules 16 | import paf 17 | import utils 18 | 19 | 20 | ################################################# 21 | # Make Missing Directories With Original Perms 22 | ############################################### 23 | 24 | def make_missing_dirs(config, unpack_path, p_len): 25 | ''' 26 | This is an add-on function that restores permissions to folders. 27 | This was a known bug for all of alpha but is finally patched. 28 | Folder permissions aren't stored in a tar so a separate system 29 | was created to handle this using pickle and paf functions. 30 | ''' 31 | fname = 'custom_dirs.make_missing_dirs()' 32 | # Find All Subdirs 33 | dirs = paf.find_subdirs(unpack_path) 34 | # Sort Subdirs by Ascending Path Depth So Parents Are Created First 35 | dirs.sort(key=lambda x: x.count('/')) 36 | 37 | for d in dirs: 38 | if not os.path.exists(d[p_len:]): 39 | os.makedirs(d[p_len:]) 40 | 41 | if os.path.exists(unpack_path + '/folder_permissions.pickle'): 42 | # Load Folder Permissions Pickle 43 | folder_perms = pickle.load(open(unpack_path + '/folder_permissions.pickle', 'rb')) 44 | 45 | for x in folder_perms: 46 | os.system('chmod ' + paf.perm_to_num(x[1]) + ' ' + paf.escape_bash_input(x[0])) 47 | os.system('chown ' + x[2] + ':' + x[3] + ' ' + paf.escape_bash_input(x[0])) 48 | 49 | else: 50 | paf.write_to_log(fname, 'Folder Permissions Pickle is Missing!', config['log']) 51 | 52 | 53 | ################################ 54 | # Clean Up and Repack On Exit 55 | ############################## 56 | 57 | def repack(config, info, unpack_path): 58 | ''' 59 | Cleans up after comparing an already
created custom tar. 60 | ''' 61 | fname = 'custom_dirs.repack()' 62 | paf.rm_dir(unpack_path, sudo=False) 63 | paf.write_to_log(fname, 'Cleaned Up Unpacked Files', config['log']) 64 | 65 | if os.path.exists(info['tar']): 66 | # Re-Compress Custom Tar 67 | print('Re-Compressing Tar...') 68 | if any(re.findall('pigz', l.lower()) for l in utils.pacman_Q()): 69 | os.system('/usr/bin/pigz ' + info['tar'] + ' -f') 70 | else: 71 | paf.gz_c(info['tar'], rm=True) 72 | paf.write_to_log(fname, 'Compressed ' + info['tar'], config['log']) 73 | 74 | 75 | ################################ 76 | # Compare Files With Checksum 77 | ############################## 78 | 79 | def compare_files(config, dir_list, unpack_path, p_len): 80 | ''' 81 | Compares an unpacked set of custom user files against the current system. 82 | Returns a dict of added, removed and changed files on the system. 83 | ''' 84 | fname = 'custom_dirs.compare_files()' 85 | # Core Compare Results 86 | diff_added = set() 87 | diff_removed = set() 88 | diff_large = set() 89 | diff_noread = set() 90 | diff_changed = set() 91 | 92 | # Compare Checksums For Files That Exist 93 | paf.write_to_log(fname, 'Started Sorting and Comparing Files...', config['log']) 94 | 95 | # Search Directories 96 | unpack_files = paf.find_files(unpack_path) 97 | current_files = paf.find_files(dir_list) 98 | 99 | # Find Added Files and Remove From Csum Queue 100 | diff_added.update(current_files - {f[p_len:] for f in unpack_files}) 101 | current_files.difference_update(diff_added) 102 | 103 | # Find Removed Files and Trim From Csum Queue 104 | diff_removed.update(unpack_files - {unpack_path + f for f in current_files}) 105 | unpack_files.difference_update(diff_removed) 106 | try: 107 | diff_removed.remove(unpack_path + '/folder_permissions.pickle') 108 | except KeyError: 109 | paf.write_to_log(fname, 'Error: Couldn\'t Find Permission Pickle.', config['log']) 110 | 111 | # Only Checksum Files That Exist in Both Current AND Unpack 112 |
paf.write_to_log(fname, 'Started Checksumming Custom Files...', config['log']) 113 | unpack_csum = paf.checksum_files(unpack_files, output='Checksumming Stored Files') 114 | current_csum = paf.checksum_files(current_files, output='Checksumming Current Files') 115 | paf.write_to_log(fname, 'Finished Checksumming Custom Files', config['log']) 116 | 117 | # Find Exceptions and Trim (Iterate Over Copies So Removal Is Safe) 118 | for csum in list(unpack_csum): 119 | if csum[1] == 'TOO LARGE!': 120 | diff_large.add(csum) 121 | unpack_csum.remove(csum) 122 | paf.write_to_log(fname, csum[0] + ' Was Too Large To Checksum!', config['log']) 123 | elif csum[1] == 'UNREADABLE!': 124 | diff_noread.add(csum) 125 | unpack_csum.remove(csum) 126 | paf.write_to_log(fname, csum[0] + ' Was Unreadable!', config['log']) 127 | 128 | for csum in list(current_csum): 129 | if csum[1] == 'TOO LARGE!': 130 | diff_large.add(csum) 131 | current_csum.remove(csum) 132 | paf.write_to_log(fname, csum[0] + ' Was Too Large To Checksum!', config['log']) 133 | elif csum[1] == 'UNREADABLE!': 134 | diff_noread.add(csum) 135 | current_csum.remove(csum) 136 | paf.write_to_log(fname, csum[0] + ' Was Unreadable!', config['log']) 137 | 138 | # Find Changed Files 139 | diff_changed.update(current_csum - {(tpl[0][p_len:], tpl[1]) for tpl in unpack_csum}) 140 | paf.write_to_log(fname, 'Finished Comparing and Sorting Files', config['log']) 141 | 142 | compare_results = { 143 | 'added': diff_added, 144 | 'removed': diff_removed, 145 | 'changed': diff_changed, 146 | 'large': diff_large, 147 | 'noread': diff_noread 148 | } 149 | 150 | return compare_results 151 | 152 | 153 | ################################## 154 | # Full Overwrite W/out Checksum 155 | ################################ 156 | 157 | def force_overwrite(config, unpack_path, p_len): 158 | ''' 159 | Restore Files Without Checksum 160 | ''' 161 | fname = 'custom_dirs.force_overwrite()' 162 | 163 | # Allow Exit Since This Is a Bad Idea 164 | paf.prWarning('OVERWRITING FILES WITHOUT CHECKSUMS CAN BE EXTREMELY
DANGEROUS!') 165 | if paf.yn_frame('Do You Still Want to Continue and Restore ALL The Files You Stored?') is False: 166 | return 167 | 168 | # Overwrite Files 169 | paf.write_to_log(fname, 'Starting Force Overwrite Process...', config['log']) 170 | print('Starting Full File Restore! Please Be Patient As All Files are Overwritten...') 171 | fs_stored = paf.find_files(unpack_path) 172 | try: 173 | fs_stored.remove(unpack_path + '/folder_permissions.pickle') 174 | except Exception: 175 | pass 176 | make_missing_dirs(config, unpack_path, p_len) 177 | for f in track(fs_stored, description='Overwriting Files'): 178 | shutil.move(f, f[p_len:]) 179 | 180 | paf.prSuccess('Done Overwriting Files!') 181 | paf.write_to_log(fname, 'Finished Force Overwrite Of Files', config['log']) 182 | 183 | 184 | ##################################### 185 | # Overwrite Using Checksum Results 186 | ################################### 187 | 188 | def smart_overwrite(config, csum_results, unpack_path, p_len): 189 | ''' 190 | Main File Restoration Logic 191 | ''' 192 | fname = 'custom_dirs.smart_overwrite()' 193 | 194 | if csum_results['changed']: 195 | paf.write_to_log(fname, 'Found ' + str(len(csum_results['changed'])) + ' Changed Files', config['log']) 196 | print('') 197 | print('#################################') 198 | paf.prWarning('The Following Files Have Changed:') 199 | print('#################################') 200 | print('') 201 | for f in list(csum_results['changed']): 202 | paf.prChanged(f[0]) 203 | print('') 204 | 205 | if paf.yn_frame('Do You Want to Restore ' + str(len(csum_results['changed'])) + ' Files That Have Been CHANGED?') is True: 206 | for f in track(csum_results['changed'], description='Restoring Changed Files'): 207 | shutil.move(unpack_path + f[0], f[0]) 208 | paf.write_to_log(fname, 'Restored Changed Files', config['log']) 209 | else: 210 | paf.write_to_log(fname, 'User Declined Restoring Changed Files', config['log']) 211 | 212 | if csum_results['removed']: 213 | 
paf.write_to_log(fname, 'Found ' + str(len(csum_results['removed'])) + ' Removed Files', config['log']) 214 | print('') 215 | print('######################################') 216 | paf.prWarning('The Following Files Have Been Removed:') 217 | print('######################################') 218 | print('') 219 | for f in list(csum_results['removed']): 220 | paf.prRemoved(f[p_len:]) 221 | print('') 222 | 223 | if paf.yn_frame('Do You Want to Restore ' + str(len(csum_results['removed'])) + ' Files That Have Been REMOVED?') is True: 224 | make_missing_dirs(config, unpack_path, p_len) 225 | for f in track(csum_results['removed'], description='Restoring Removed Files'): 226 | shutil.move(f, f[p_len:]) 227 | paf.write_to_log(fname, 'Restored Removed Files', config['log']) 228 | else: 229 | paf.write_to_log(fname, 'User Declined Restoring Removed Files', config['log']) 230 | 231 | if csum_results['added']: 232 | paf.write_to_log(fname, 'Found ' + str(len(csum_results['added'])) + ' New Files', config['log']) 233 | print('') 234 | print('####################################') 235 | paf.prWarning('The Following Files Have Been Added:') 236 | print('####################################') 237 | print('') 238 | for f in list(csum_results['added']): 239 | paf.prAdded(f) 240 | print('') 241 | 242 | if paf.yn_frame('Do You Want to Remove ' + str(len(csum_results['added'])) + ' Files That Have Been ADDED?') is True: 243 | for f in track(csum_results['added'], description='Removing New Files'): 244 | os.remove(f) 245 | paf.write_to_log(fname, 'Removed New Files', config['log']) 246 | else: 247 | paf.write_to_log(fname, 'User Declined Removing New Files', config['log']) 248 | 249 | paf.prSuccess('Done Restoring Files!') 250 | paf.write_to_log(fname, 'Done Restoring Files', config['log']) 251 | 252 | 253 | ############################### 254 | # Restore Custom Directories 255 | ############################# 256 | 257 | def restore(config, info, dir_list, checksum): 258 | ''' 259 | This 
is the main 'api' entry point for file restoration. 260 | This function orchestrates the process, handing off work to other functions. 261 | ''' 262 | fname = 'custom_dirs.restore()' 263 | unpack_path = info['tar'][:-4] 264 | p_len = len(unpack_path) 265 | paf.write_to_log(fname, 'Started Custom File Restoration', config['log']) 266 | 267 | # Decompress Tar 268 | if os.path.exists(info['tar.gz']): 269 | paf.prWarning('Decompressing Custom Tar....') 270 | if any(re.findall('pigz', line.lower()) for line in utils.pacman_Q()): 271 | os.system('/usr/bin/pigz -d ' + info['tar.gz'] + ' -f') 272 | paf.write_to_log(fname, 'Decompressed Tar With Pigz', config['log']) 273 | else: 274 | paf.gz_d(info['tar.gz']) 275 | paf.write_to_log(fname, 'Decompressed Tar With Python', config['log']) 276 | 277 | # Check Tar Csum And Unpack 278 | if os.path.exists(info['tar']): 279 | # Checksum Tar 280 | print('Checking Integrity of Tar...') 281 | tar_csum = paf.checksum_file(info['tar'])[1] 282 | paf.write_to_log(fname, 'Checksummed Tar', config['log']) 283 | 284 | if tar_csum == checksum: 285 | paf.write_to_log(fname, 'Tar Passed Checksum Integrity Check', config['log']) 286 | paf.prSuccess('Tar Passed Integrity Check') 287 | else: 288 | paf.write_to_log(fname, 'Custom Tar Failed Integrity Check!', config['log']) 289 | paf.prError('Custom Tar Failed Integrity Check!') 290 | paf.prBold('Skipping Custom File Restoration!') 291 | return 292 | 293 | # Clean Then Unpack Tar 294 | paf.prWarning('Unpacking Files from Tar....') 295 | paf.rm_dir(unpack_path, sudo=True) 296 | paf.untar_dir(info['tar']) 297 | paf.write_to_log(fname, 'Unpacked Custom Files From Tar', config['log']) 298 | 299 | else: 300 | # Skip If Tar is Missing 301 | paf.write_to_log(fname, 'Meta Data File Specifies A Tar That is Now Missing!', config['log']) 302 | paf.prError('This Restore Point is Missing Its Custom Tar!') 303 | return 304 | 305 | if paf.yn_frame('Do You Want to Compare Restore Point Files Against Your Current File System?') is 
True: 306 | results = compare_files(config, dir_list, unpack_path, p_len) 307 | # Exit If No Changes Made to Files 308 | if len(results['added']) + len(results['removed']) + len(results['changed']) == 0: 309 | paf.write_to_log(fname, 'Checksum Returned 0 Changed, Removed or Added Files', config['log']) 310 | paf.prSuccess('No Changes Have Been Made to Your File System!') 311 | else: 312 | smart_overwrite(config, results, unpack_path, p_len) 313 | 314 | else: 315 | force_overwrite(config, unpack_path, p_len) 316 | 317 | # Cleanup After Runtime 318 | repack(config, info, unpack_path) 319 | 320 | 321 | ############################# 322 | # Store Custom Directories 323 | ########################### 324 | 325 | def store(config, info): 326 | ''' 327 | Packs up user defined directories. 328 | ''' 329 | fname = 'custom_dirs.store()' 330 | paf.write_to_log(fname, str(len(info['dir_list'])) + ' Folders Selected For Storage', config['log']) 331 | tmpfile = tempfile.gettempdir() + '/folder_permissions.pickle' 332 | 333 | # Fetch Folder Permissions and Pickle 334 | folder_perms = set() 335 | for d in info['dir_list']: 336 | folder_perms.update(paf.get_permissions(d, 'folders')) 337 | with open(tmpfile, 'wb') as fh: pickle.dump(folder_perms, fh) 338 | # Scan For Files 339 | files = paf.find_files(info['dir_list']) 340 | 341 | # Pack Custom Files Into Tar 342 | with tarfile.open(info['tar'], 'w') as tar: 343 | tar.add(tmpfile, arcname='folder_permissions.pickle') 344 | for f in track(files, description='Adding Files to Tar'): 345 | tar.add(f) 346 | paf.rm_file(tmpfile, sudo=False) 347 | 348 | paf.write_to_log(fname, 'Created ' + info['tar'], config['log']) 349 | 350 | # Create Checksum for Tar 351 | print('Creating Checksum...') 352 | pack_csum = paf.checksum_file(info['tar'])[1] 353 | paf.write_to_log(fname, 'Checksummed Tar', config['log']) 354 | 355 | # Compress Custom Tar 356 | print('Compressing Custom Tar...') 357 | if any(re.findall('pigz', l.lower()) for l in utils.pacman_Q()): 358 
| os.system('/usr/bin/pigz ' + info['tar'] + ' -f') 359 | else: 360 | paf.gz_c(info['tar'], rm=True) 361 | paf.write_to_log(fname, 'Compressed ' + info['tar'], config['log']) 362 | 363 | pack_results = { 364 | 'file_count': len(files), 365 | 'raw_size': paf.convert_size(paf.size_of_files(files)), 366 | 'compressed_size': paf.convert_size(os.path.getsize(info['tar'] + '.gz')), 367 | 'csum': pack_csum 368 | } 369 | 370 | return pack_results 371 | -------------------------------------------------------------------------------- /core/error.py: -------------------------------------------------------------------------------- 1 | #! /usr/bin/env python3 2 | import re 3 | 4 | # Local Modules 5 | import paf 6 | import utils 7 | 8 | 9 | def too_many_pkgs_found(config, parms, found_pkgs, pkg_results): 10 | """ 11 | This auto resolves some very bizarre edge cases I have run into. 12 | """ 13 | fname = 'error.too_many_pkgs_found(' + parms['type'] + parms['id'] + ')' 14 | paf.write_to_log(fname, 'Starting Debug Process...', config['log']) 15 | 16 | found_files = utils.trim_pkg_list(paf.basenames(found_pkgs)) 17 | search_files = paf.basenames(pkg_results['search']) 18 | bad_files = (found_files - search_files) 19 | paf.write_to_log(fname, 'Debug Process Found ' + str(len(bad_files)) + ' Files That Do Not Belong!', config['log']) 20 | 21 | if len(found_files) - len(search_files) == len(bad_files): 22 | paf.write_to_log(fname, 'Cleaning Found Files...', config['log']) 23 | bad_files_full = set() 24 | 25 | for b in bad_files: 26 | for f in found_pkgs: 27 | if re.search(re.escape(b), f): 28 | bad_files_full.add(f) 29 | 30 | for f in bad_files_full: 31 | found_pkgs.remove(f) 32 | 33 | paf.write_to_log(fname, 'Debug Process Was Able to Fix All Issues!', config['log']) 34 | return (True, found_pkgs) 35 | 36 | else: 37 | paf.write_to_log(fname, 'Debug Process Was NOT Able to Fix All Issues!', config['log']) 38 | return (False, found_pkgs) 39 | 
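pacback's diffing (in `custom_dirs.compare_files()` above and `meta.compare()` below) is built on plain set arithmetic over `'name version'` strings. A minimal, self-contained sketch of the same idea — the package names and versions here are made up for illustration, not real pacback data:

```python
# Hypothetical 'name version' entries, mimicking `pacman -Q` output
old = {'linux 5.7.2', 'python 3.8.3', 'git 2.27.0'}
new = {'linux 5.7.4', 'python 3.8.3', 'vim 8.2'}

# Strip versions to bare names, as meta.compare() does with pkg.split(' ')[0]
old_names = {p.split(' ')[0] for p in old}
new_names = {p.split(' ')[0] for p in new}

added = new_names - old_names      # names only present now
removed = old_names - new_names    # names only present before
# Old entries that vanished, but whose name survives, had a version change
changed = {p for p in (old - new) if p.split(' ')[0] in new_names}

print(added)    # {'vim'}
print(removed)  # {'git'}
print(changed)  # {'linux 5.7.2'}
```

Using sets keeps each classification a single O(n) difference operation instead of nested list scans, which is why the real code converts every package list into a set before comparing.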
-------------------------------------------------------------------------------- /core/meta.py: -------------------------------------------------------------------------------- 1 | #! /usr/bin/env python3 2 | import re 3 | import os 4 | 5 | # Local Modules 6 | import paf 7 | import utils 8 | import session 9 | 10 | 11 | def find_in(meta_data, key): 12 | ''' 13 | Fetches the value for a 'key: value' entry in the meta data. 14 | ''' 15 | for m in meta_data: 16 | if m.split(':')[0].lower() == key.lower(): 17 | value = ':'.join(m.split(':')[1:]).strip() 18 | return value 19 | return 'None' 20 | 21 | 22 | ###################################### 23 | # Read A Meta Data File Into A Dict 24 | #################################### 25 | 26 | def read(config, meta_path): 27 | ''' 28 | Reads the raw human readable meta file into a python dictionary. 29 | Uses meta.find_in() and paf.read_between() to the find values. 30 | ''' 31 | meta_raw = paf.read_file(meta_path) 32 | 33 | meta_dict = { 34 | 'version': find_in(meta_raw, 'Version'), 35 | 'type': find_in(meta_raw, 'Type'), 36 | 'stype': find_in(meta_raw, 'SubType'), 37 | 'date': find_in(meta_raw, 'Date Created'), 38 | 'time': find_in(meta_raw, 'Time Created'), 39 | 'pkgs_installed': find_in(meta_raw, 'Packages Installed'), 40 | 'pkg_list': set(paf.read_between(' Pacman List ', '', meta_raw, re_flag=True)), 41 | 'label': find_in(meta_raw, 'Label'), 42 | 43 | # Full Values 44 | 'pkgs_cached': find_in(meta_raw, 'Packages Cached'), 45 | 'cache_size': find_in(meta_raw, 'Package Cache Size'), 46 | 'dir_list': set(paf.read_between(' Dir List ', ' Pacman List ', meta_raw, re_flag=True)[:-1]), 47 | 'file_count': find_in(meta_raw, 'Dir File Count'), 48 | 'file_raw_size': find_in(meta_raw, 'Dir Raw Size'), 49 | 'tar_size': find_in(meta_raw, 'Tar Compressed Size'), 50 | 'tar_csum': find_in(meta_raw, 'Tar Checksum')} 51 | 52 | return meta_dict 53 | 54 | 55 | ############################## 56 | # Compare Two Package Lists 57 | 
############################ 58 | 59 | def compare(config, old_pkgs, new_pkgs): 60 | ''' 61 | Compares two lists of packages and returns a dictionary containing 62 | changed, added, and removed packages. Also returns a formatted list 63 | for searching with utils.search_cache(). 64 | ''' 65 | 66 | # Strips the Version 67 | old_pkg_strp = {pkg.split(' ')[0] for pkg in old_pkgs} 68 | current_pkg_strp = {pkg.split(' ')[0] for pkg in new_pkgs} 69 | added_pkgs = set(current_pkg_strp - old_pkg_strp) 70 | removed_pkgs = set(old_pkg_strp - current_pkg_strp) 71 | 72 | # Clears Removed Packages From changed_pkgs (an empty pattern would match everything) 73 | c_plus_r = set(old_pkgs - new_pkgs) 74 | search = ('|'.join(list(re.escape(pkg) for pkg in removed_pkgs))) 75 | changed_pkgs = set() 76 | for pkg in c_plus_r: 77 | if not search or not re.findall(search, pkg.lower()): 78 | changed_pkgs.add(pkg) 79 | 80 | results = { 81 | 'search': paf.replace_spaces(c_plus_r, '-'), 82 | 'c_pkgs': changed_pkgs, 83 | 'a_pkgs': added_pkgs, 84 | 'r_pkgs': removed_pkgs} 85 | 86 | return results 87 | 88 | 89 | ######################################### 90 | # Checksum And Validate Meta Integrity 91 | ####################################### 92 | 93 | def validate(config, info): 94 | ''' 95 | Checks if a meta file has become corrupted or is missing. 
96 | ''' 97 | fname = 'meta.validate(' + info['type'] + info['id'] + ')' 98 | 99 | if os.path.exists(info['meta']) and os.path.exists(info['meta_md5']): 100 | paf.write_to_log(fname, 'Meta File and Meta Checksum Are Present', config['log']) 101 | csum = str(open(info['meta_md5']).read()).strip() 102 | msum = str(paf.checksum_file(info['meta'])[1]).strip() 103 | 104 | if csum == msum: 105 | paf.write_to_log(fname, 'Meta Passed Checksum', config['log']) 106 | return 107 | 108 | else: 109 | paf.write_to_log(fname, 'Meta Checksum FAILED!', config['log']) 110 | paf.prError(info['TYPE'] + ' ' + info['id'] + ' Has Failed its Checksum Check!') 111 | paf.prError('This ' + info['TYPE'] + ' Has Likely Become Corrupt!') 112 | 113 | if paf.yn_frame('Do You Want to Remove This ' + info['TYPE'] + ' Now?') is True: 114 | utils.remove_id(config, info) 115 | session.abort(fname, 'User Deleted Corrupted ' + info['TYPE'], 116 | info['TYPE'] + ' Was Removed. Exiting Now!', config) 117 | else: 118 | session.abort(fname, 'User Chose NOT to Remove Corrupted ' + info['TYPE'], 119 | 'Okay, Leaving the ' + info['TYPE'] + ' Alone. 
Exiting Now!', config) 120 | 121 | elif os.path.exists(info['meta']) and not os.path.exists(info['meta_md5']): 122 | paf.write_to_log(fname, 'Meta File is Missing its Checksum File!', config['log']) 123 | paf.prError(info['TYPE'] + ' ' + info['id'] + ' is Missing a Checksum!') 124 | 125 | if paf.yn_frame('Do You Still Want To Continue?') is False: 126 | session.abort(fname, 'User Exited Due to Missing Checksum File', 127 | 'Okay, Aborting Due to Missing Checksum', config) 128 | else: 129 | paf.write_to_log(fname, 'User Chose To Continue Even Though The Checksum is Missing', config['log']) 130 | return 131 | 132 | 133 | ############################# 134 | # Higher Level 'API' Calls 135 | ########################### 136 | 137 | def compare_meta(config, old_meta, new_meta): 138 | return compare(config, old_meta['pkg_list'], new_meta['pkg_list']) 139 | 140 | 141 | def compare_now(config, old_meta): 142 | return compare(config, old_meta['pkg_list'], utils.pacman_Q()) 143 | -------------------------------------------------------------------------------- /core/pacback.py: -------------------------------------------------------------------------------- 1 | #!
/usr/bin/env python3 2 | import re 3 | import signal 4 | import argparse 5 | from functools import partial 6 | 7 | # Local Modules 8 | import paf 9 | import user 10 | import utils 11 | import create 12 | import restore 13 | import session 14 | 15 | 16 | ################## 17 | # CLI Arguments 18 | ################ 19 | 20 | parser = argparse.ArgumentParser(description="A package rollback utility for Arch Linux.") 21 | 22 | # Creation 23 | parser.add_argument("-c", "--create_rp", metavar=('1'), 24 | help="Generate a pacback restore point.") 25 | parser.add_argument("--hook", action='store_true', 26 | help="Used exclusively by the pacback hook to create snapshots.") 27 | 28 | # Restoration 29 | parser.add_argument("-rp", "--restore_point", metavar=('1'), 30 | help="Rollback to a restore point.") 31 | parser.add_argument("-ss", "--snapshot", metavar=('1'), 32 | help="Rollback to a snapshot.") 33 | parser.add_argument("-pkg", "--package", nargs='*', default=[], metavar=('pkg_name'), 34 | help="Rollback a list of packages.") 35 | parser.add_argument("-dt", "--date", metavar=('2020/06/23'), 36 | help="Rollback to a date in the Arch Archive.") 37 | 38 | # Optional Arguments 39 | parser.add_argument("-f", "--full_rp", action='store_true', 40 | help="Create full restore point.") 41 | parser.add_argument("-d", "--add_dir", nargs='*', default=[], metavar=('/path/here'), 42 | help="Add custom directories to your restore point when using `--full_rp`.") 43 | parser.add_argument("-nc", "--no_confirm", action='store_true', 44 | help="Skip asking user questions. 
Will typically answer yes to all.") 45 | parser.add_argument("-l", "--label", metavar=('Label Name'), 46 | help="Tag your restore point with a label.") 47 | 48 | # Utils 49 | parser.add_argument("-ih", "--install_hook", action='store_true', 50 | help="Install a pacman hook that creates snapshots.") 51 | parser.add_argument("-rh", "--remove_hook", action='store_true', 52 | help="Remove the pacman hook that creates snapshots.") 53 | parser.add_argument("-cl", "--clean", action='store_true', 54 | help="Clean old packages, orphaned packages, and old restore points.") 55 | parser.add_argument("-rm", "--remove", metavar=('2'), 56 | help="Removes the selected restore point.") 57 | 58 | # Show Info 59 | parser.add_argument("-v", "--version", action='store_true', 60 | help="Display pacback version and cache info.") 61 | parser.add_argument("-i", "--info", metavar=('1'), 62 | help="Print information about a restore point.") 63 | parser.add_argument("-df", "--diff", nargs=2, metavar=('1 2'), 64 | help="Compare any two restore points or snapshots.") 65 | parser.add_argument("-ls", "--list", action='store_true', 66 | help="List information about all existing restore points and snapshots.") 67 | parser.add_argument("-cache", "--cache_size", action='store_true', 68 | help="Calculate reported and actual cache sizes.") 69 | 70 | args = parser.parse_args() 71 | config = session.load_config() 72 | 73 | ########################## 74 | # Display Info For User 75 | ######################## 76 | 77 | if args.version: 78 | print('Pacback Version: ' + config['version']) 79 | print('PAF Version: ' + config['paf']) 80 | 81 | if args.info: 82 | if re.findall(r'^(rp[0-9][0-9]$|rp[0-9]$|ss[0-9][0-9]$|ss[0-9])$', args.info): 83 | user.print_info(config, args.info) 84 | else: 85 | paf.prError('Invalid Input: Argument Must Specify Type and Number! 
(IE: rp02 or ss4)') 86 | 87 | if args.list: 88 | user.list_all(config) 89 | 90 | if args.diff: 91 | if all(re.search(r'^(rp[0-9][0-9]$|rp[0-9]$|ss[0-9][0-9]$|ss[0-9])$', d) for d in args.diff): 92 | user.diff_meta(config, args.diff[0], args.diff[1]) 93 | else: 94 | paf.prError('Invalid Input: Argument Must Specify Type and Number! (IE: rp02 or ss4)') 95 | 96 | if args.create_rp or args.hook or args.package or args.snapshot or args.restore_point or args.date\ 97 | or args.remove or args.clean or args.install_hook or args.remove_hook or args.cache_size: 98 | 99 | # Safely Init the Environment 100 | session.lock(config) 101 | signal.signal(signal.SIGINT, partial(session.sig_catcher, config)) 102 | 103 | ###################### 104 | # Creation Commands 105 | #################### 106 | 107 | if args.create_rp: 108 | if re.match(r'^([0-9]|0[1-9]|[0-9][0-9])$', args.create_rp): 109 | create.restore_point(config, args.create_rp, args.full_rp, args.add_dir, args.no_confirm, args.label) 110 | else: 111 | paf.prError('Invalid Input: Argument Must Be An Integer Between 0-99!') 112 | 113 | elif args.hook: 114 | create.snapshot(config, args.label) 115 | 116 | ######################### 117 | # Restoration Commands 118 | ####################### 119 | 120 | if args.package: 121 | if not all(re.search(r'^(s*[0-9])', pkg) for pkg in args.package): 122 | restore.packages(config, args.package) 123 | else: 124 | paf.prError('Invalid Input: Package Names Should NOT Start With Digits!') 125 | 126 | elif args.snapshot: 127 | if re.match(r'^([0-9]|0[1-9]|[0-9][0-9])$', args.snapshot): 128 | restore.snapshot(config, args.snapshot) 129 | else: 130 | paf.prError('Invalid Input: Argument Must Be An Integer Between 0-99!') 131 | 132 | elif args.restore_point: 133 | if re.match(r'^([0-9]|0[1-9]|[0-9][0-9])$', args.restore_point): 134 | restore.restore_point(config, args.restore_point) 135 | else: 136 | paf.prError('Invalid Input: Argument Must Be An Integer Between 0-99!') 137 | 138 | elif 
args.date: 139 | if re.match(r'([12]\d{3}/(0[1-9]|1[0-2])/(0[1-9]|[12]\d|3[01]))', args.date): 140 | restore.archive_date(config, args.date) 141 | else: 142 | paf.prError('Invalid Input: Date Must Be in YYYY/MM/DD Format!') 143 | 144 | ##################### 145 | # Pacback Utilities 146 | ################### 147 | 148 | if args.cache_size: 149 | cache = utils.cache_size(config) 150 | print('Unique Package Versions: ' + cache[0]) 151 | print('Pacman Cache Size: ' + cache[1]) 152 | print('User(s) Cache Size: ' + cache[2]) 153 | print('Pacback Cache Size: ' + cache[3]) 154 | print('Reported Total Cache Size: ' + cache[4]) 155 | 156 | if args.remove: 157 | if re.findall(r'^([0-9]|0[1-9]|[0-9][0-9])$', args.remove): 158 | user.remove_rp(config, args.remove, args.no_confirm) 159 | else: 160 | paf.prError('Invalid Input: Argument Must Be An Integer Between 0-99!') 161 | 162 | elif args.clean: 163 | user.clean_cache(config, args.no_confirm) 164 | 165 | elif args.install_hook: 166 | utils.pacman_hook(True, config) 167 | 168 | elif args.remove_hook: 169 | utils.pacman_hook(False, config) 170 | 171 | # Safely Close the Environment 172 | session.unlock(config) 173 | -------------------------------------------------------------------------------- /core/restore.py: -------------------------------------------------------------------------------- 1 | #! /usr/bin/env python3 2 | import re 3 | import os 4 | 5 | # Local Modules 6 | import paf 7 | import meta 8 | import error 9 | import utils 10 | import session 11 | import version 12 | import custom_dirs 13 | 14 | 15 | ############################# 16 | # Main Package Restoration 17 | ########################### 18 | 19 | def main(config, parms, pkg_results): 20 | ''' 21 | This is the main restore logic for pacback. It should NOT be called directly but 22 | instead called through a higher level 'API' like call. 23 | This logic does the actual work of downgrading, removing, and installing packages. 
24 | ''' 25 | fname = 'restore.main(' + parms['type'] + parms['id'] + ')' 26 | 27 | # Branch if Packages Have Been Changed or Removed 28 | if pkg_results['search']: 29 | cache = utils.scan_caches(config) 30 | found_pkgs = utils.search_cache(pkg_results['search'], cache, config) 31 | 32 | # This is Very Bad 33 | if len(found_pkgs) > len(pkg_results['search']): 34 | paf.prError('Error: Somehow More Packages Were Found Than Were Searched For!') 35 | paf.write_to_log(fname, 'Error: Somehow More Packages Were Found Than Were Searched For!', config['log']) 36 | print('Starting Error Resolving Process...') 37 | error_handler_results = error.too_many_pkgs_found(config, parms, found_pkgs, pkg_results) 38 | 39 | if error_handler_results[0] is True: 40 | paf.prSuccess('Pacback Was Able To Automatically Resolve This Error!') 41 | found_pkgs = error_handler_results[1] 42 | else: 43 | paf.prError('Pacback Was NOT Able To Automatically Resolve This Error!') 44 | error.create_error_report() 45 | 46 | # Branch if Packages are Missing 47 | elif len(found_pkgs) < len(pkg_results['search']): 48 | missing_pkg = set(pkg_results['search'] - utils.trim_pkg_list(found_pkgs)) 49 | paf.write_to_log(fname, str(len(found_pkgs)) + ' Out of ' + str(len(pkg_results['search'])) + ' Packages Found', config['log']) 50 | 51 | paf.prWarning('Couldn\'t Find The Following Package Versions:') 52 | for pkg in missing_pkg: 53 | paf.prError(pkg) 54 | if paf.yn_frame('Do You Want To Continue Anyway?') is False: 55 | session.abort_fail(fname, 'User Aborted Rollback Because of Missing Packages', 'Aborting Rollback!', config) 56 | 57 | # This is the Best Case 58 | else: 59 | paf.prSuccess('All Packages Found In Your Local File System!') 60 | paf.write_to_log(fname, 'Found All Changed and Removed Packages', config['log']) 61 | 62 | print('Sending ' + str(len(found_pkgs)) + ' Packages to Pacman...') 63 | paf.pacman(' '.join(found_pkgs), '-U') 64 | paf.write_to_log(fname, 'Sent Pacman Selected Packages', config['log']) 65 | 66 | else: 67 | paf.prSuccess('No 
Packages Have Been Changed or Removed!') 68 | paf.write_to_log(fname, 'No Packages Have Been Changed or Removed', config['log']) 69 | 70 | # Branch if Packages Have Been Added 71 | if pkg_results['a_pkgs']: 72 | print('') 73 | paf.write_to_log(fname, str(len(pkg_results['a_pkgs'])) + ' Have Been Added Since Creation', config['log']) 74 | 75 | paf.prWarning(str(len(pkg_results['a_pkgs'])) + ' Packages Have Been Added Since Creation') 76 | for pkg in pkg_results['a_pkgs']: 77 | paf.prAdded(pkg) 78 | print('') 79 | if paf.yn_frame('Do You Want to Remove These Packages From Your System?') is True: 80 | print('') 81 | paf.pacman(' '.join(pkg_results['a_pkgs']), '-R') 82 | paf.write_to_log(fname, 'Sent Added Packages To `pacman -R`', config['log']) 83 | 84 | else: 85 | paf.prSuccess('No Packages Have Been Added!') 86 | paf.write_to_log(fname, 'No Packages Have Been Added', config['log']) 87 | 88 | 89 | ##################### 90 | # Restore Snapshot 91 | ################### 92 | 93 | def snapshot(config, id_num): 94 | ''' 95 | This handles the process of restoring snapshots. This is pretty much the same as a 96 | standard restore point but requires post-processing after the restoration to maintain 97 | the order of changes made to the system. 
98 | ''' 99 | id_num = str(id_num).zfill(2) 100 | fname = 'restore.snapshot(' + id_num + ')' 101 | paf.write_to_log(fname, 'Started Restoring Snapshot ID:' + id_num, config['log']) 102 | 103 | info = { 104 | 'id': id_num, 105 | 'type': 'ss', 106 | 'TYPE': 'Snapshot', 107 | 'meta': config['ss_paths'] + '/ss' + id_num + '.meta', 108 | 'meta_md5': config['ss_paths'] + '/.ss' + id_num + '.md5', 109 | 'path': config['ss_paths'] + '/ss' + id_num, 110 | 'pkgcache': config['ss_paths'] + '/ss' + id_num + '/pkg-cache' 111 | } 112 | 113 | # Read Meta Data File, Check Version, Compare Results, Restore 114 | meta.validate(config, info) 115 | ss_dict = meta.read(config, info['meta']) 116 | version.compare(config, ss_dict['version']) 117 | main(config, info, meta.compare_now(config, ss_dict)) 118 | 119 | # Resets Order So The Restored Version is Zero 120 | paf.write_to_log(fname, 'Started Rewinding Snapshots Back to Zero', config['log']) 121 | 122 | # Removes Snapshots From Zero to Restored Snapshot ID 123 | for n in range(0, int(info['id'])): 124 | rm_info = { 125 | 'id': str(n).zfill(2), 126 | 'type': 'ss', 127 | 'TYPE': 'Snapshot', 128 | 'meta': config['ss_paths'] + '/ss' + str(n).zfill(2) + '.meta', 129 | 'meta_md5': config['ss_paths'] + '/.ss' + str(n).zfill(2) + '.md5' 130 | } 131 | utils.remove_id(config, rm_info) 132 | 133 | # Shifts Snapshots Back, So Now Restored Snapshot Is New Zero 134 | id_counter = 0 135 | for n in range(int(info['id']), (config['max_ss'] + 1)): 136 | meta_path_old = config['ss_paths'] + '/ss' + str(n).zfill(2) + '.meta' 137 | meta_path_new = config['ss_paths'] + '/ss' + str(id_counter).zfill(2) + '.meta' 138 | hash_path_old = config['ss_paths'] + '/.ss' + str(n).zfill(2) + '.md5' 139 | hash_path_new = config['ss_paths'] + '/.ss' + str(id_counter).zfill(2) + '.md5' 140 | meta_found = os.path.exists(meta_path_old) 141 | csum_found = os.path.exists(hash_path_old) 142 | 143 | if meta_found and csum_found: 144 | os.rename(meta_path_old, meta_path_new) 
145 | os.rename(hash_path_old, hash_path_new) 146 | id_counter += 1 147 | elif meta_found and not csum_found: 148 | paf.write_to_log(fname, 'Snapshot ' + str(n).zfill(2) + ' is Missing its Checksum File!', config['log']) 149 | paf.rm_file(meta_path_old, sudo=False) 150 | paf.write_to_log(fname, 'Removed Snapshot ID:' + str(n).zfill(2), config['log']) 151 | elif not meta_found and csum_found: 152 | paf.write_to_log(fname, hash_path_old + ' is an Orphaned Checksum', config['log']) 153 | paf.rm_file(hash_path_old, sudo=False) 154 | paf.write_to_log(fname, 'Removed Orphaned Checksum', config['log']) 155 | else: 156 | pass 157 | 158 | paf.write_to_log(fname, 'Finished Rewinding Snapshots Back to Zero', config['log']) 159 | 160 | # Finish Last Checks and Exit 161 | utils.reboot_check(config) 162 | paf.write_to_log(fname, 'Finished Restoring Snapshot ID:' + id_num, config['log']) 163 | 164 | 165 | ############################ 166 | # Restore A Restore Point 167 | ########################## 168 | 169 | def restore_point(config, id_num): 170 | ''' 171 | This preps the system for a restoration then hands off to restore.main() 172 | ''' 173 | id_num = str(id_num).zfill(2) 174 | fname = 'restore.restore_point(' + id_num + ')' 175 | paf.write_to_log(fname, 'Started Restoring Restore Point ID:' + id_num, config['log']) 176 | 177 | info = { 178 | 'id': id_num, 179 | 'type': 'rp', 180 | 'TYPE': 'Restore Point', 181 | 'meta': config['rp_paths'] + '/rp' + id_num + '.meta', 182 | 'meta_md5': config['rp_paths'] + '/.rp' + id_num + '.md5', 183 | 'path': config['rp_paths'] + '/rp' + id_num, 184 | 'pkgcache': config['rp_paths'] + '/rp' + id_num + '/pkg-cache', 185 | 'tar': config['rp_paths'] + '/rp' + id_num + '/rp' + id_num + '_dirs.tar', 186 | 'tar.gz': config['rp_paths'] + '/rp' + id_num + '/rp' + id_num + '_dirs.tar.gz' 187 | } 188 | 189 | # Read Meta File, Check Version, Compare Results 190 | meta.validate(config, info) 191 | rp_dict = meta.read(config, info['meta']) 192 | 
version.compare(config, rp_dict['version']) 193 | main(config, info, meta.compare_now(config, rp_dict)) 194 | 195 | # Unpack and Compare Directories Stored By User 196 | if rp_dict['dir_list']: 197 | custom_dirs.restore(config, info, rp_dict['dir_list'], rp_dict['tar_csum']) 198 | 199 | # Finish Last Checks and Exit 200 | utils.reboot_check(config) 201 | paf.write_to_log(fname, 'Finished Restoring Restore Point ID:' + id_num, config['log']) 202 | 203 | 204 | ################################ 205 | # Manual User Package Restore 206 | ############################## 207 | 208 | def packages(config, pkgs): 209 | ''' 210 | Allows the user to rollback packages by name. 211 | Packages are not sent to pacman until the user has 212 | selected all the packages they want to restore/change. 213 | ''' 214 | # Startup 215 | fname = 'restore.packages(' + str(len(pkgs)) + ')' 216 | pkg_paths = list() 217 | cache = utils.scan_caches(config) 218 | 219 | # Search For Each Package Name And Let User Select Version 220 | paf.write_to_log(fname, 'Started Search for ' + ', '.join(pkgs), config['log']) 221 | for pkg in pkgs: 222 | found_pkgs = utils.user_pkg_search(pkg, cache) 223 | sort_pkgs = sorted(found_pkgs, reverse=True) 224 | 225 | if found_pkgs: 226 | paf.write_to_log(fname, 'Found ' + str(len(found_pkgs)) + ' Cached Versions for `' + pkg + '`', config['log']) 227 | paf.prSuccess('Pacback Found the Following Versions for `' + pkg + '`:') 228 | answer = paf.multi_choice_frame(sort_pkgs) 229 | 230 | # Lets User Abort Package Selection 231 | if answer is False or answer is None: 232 | paf.write_to_log(fname, 'User Selected NOTHING For ' + pkg, config['log']) 233 | else: 234 | for x in cache: 235 | if re.findall(re.escape(answer), x): 236 | pkg_paths.append(x) 237 | break 238 | 239 | else: 240 | paf.prError('No Packages Found Under the Name: ' + pkg) 241 | paf.write_to_log(fname, 'Search for ' + pkg.upper() + ' Returned ZERO Results!', config['log']) 242 | 243 | if pkg_paths: 244 | paf.pacman(' 
'.join(pkg_paths), '-U') 245 | paf.write_to_log(fname, 'Sent Pacman Selected Packages For Installation', config['log']) 246 | else: 247 | paf.write_to_log(fname, 'User Selected No Packages or No Packages Were Found', config['log']) 248 | 249 | 250 | ############################# 251 | # Restore Packages to Date 252 | ########################### 253 | 254 | def archive_date(config, date): 255 | ''' 256 | This function simply automates the date rollback instructions found on the Arch Wiki. 257 | https://wiki.archlinux.org/index.php/Arch_Linux_Archive#How_to_restore_all_packages_to_a_specific_date 258 | ''' 259 | # Startup 260 | fname = 'restore.archive_date(' + str(date) + ')' 261 | mirror = '/etc/pacman.d/mirrorlist' 262 | 263 | # Done as a Fail Safe 264 | if len(paf.read_file(mirror)) > 2: 265 | os.system('mv ' + mirror + ' ' + mirror + '.pacback') 266 | paf.write_to_log(fname, 'Backed Up Existing Mirrorlist', config['log']) 267 | else: 268 | paf.write_to_log(fname, 'Skipped Mirrorlist Backup. File Seems Misformatted!', config['log']) 269 | 270 | paf.export_iterable(mirror, ['## Set By Pacback', 'Server=https://archive.archlinux.org/repos/' + date + '/$repo/os/$arch']) 271 | paf.write_to_log(fname, 'Added ' + date + ' Archive URL To Mirrorlist', config['log']) 272 | 273 | # Run Pacman Update to Perform the Downgrade 274 | os.system('/usr/bin/pacman -Syyuu') 275 | paf.write_to_log(fname, 'Sent -Syyuu to Pacman', config['log']) 276 | 277 | # Restore the Non-Archive URL Mirrorlist 278 | if os.path.exists(mirror + '.pacback') is False: 279 | paf.write_to_log(fname, 'Backup Mirrorlist Is Missing', config['log']) 280 | if paf.yn_frame('Missing Mirrorlist! 
Do You Want to Fetch a New HTTPS Mirrorlist?') is True: 281 | if utils.fetch_new_mirrorlist() is True: 282 | paf.write_to_log(fname, 'A New Mirrorlist Was Successfully Downloaded', config['log']) 283 | else: 284 | session.abort_fail(fname, 'User Declined Country Selection!', 285 | 'Please Manually Replace Your Mirrorlist!', config['log']) 286 | else: 287 | session.abort_fail(fname, 'Backup Mirrorlist Is Missing and User Declined Download!', 288 | 'Please Manually Replace Your Mirrorlist!', config['log']) 289 | else: 290 | os.system('mv ' + mirror + '.pacback ' + mirror) 291 | paf.write_to_log(fname, 'Backup Mirrorlist Was Restored Successfully', config['log']) 292 | print('Refreshing Pacman Database...') 293 | os.system('/usr/bin/pacman -Sy > /dev/null') 294 | paf.write_to_log(fname, 'Updated Pacman Database After Restoring Mirrorlist', config['log']) 295 | -------------------------------------------------------------------------------- /core/session.py: -------------------------------------------------------------------------------- 1 | #! /usr/bin/env python3 2 | import os 3 | import sys 4 | import fcntl 5 | import datetime as dt 6 | 7 | # Local Modules 8 | import paf 9 | 10 | 11 | ####################### 12 | # Session Management 13 | ##################### 14 | 15 | def lock(config): 16 | ''' 17 | This checks if pacback is being run by root or sudo, 18 | then checks if an active session is already in progress. 
19 | ''' 20 | fname = 'session.lock()' 21 | if paf.am_i_root() is False: 22 | sys.exit('Critical Error: This Command Must Be Run As Root!') 23 | 24 | try: 25 | lock = os.open(config['slock'], os.O_CREAT) 26 | fcntl.flock(lock, fcntl.LOCK_EX | fcntl.LOCK_NB) 27 | paf.start_log(fname, config['log']) 28 | paf.write_to_log(fname, 'Passed Root Check', config['log']) 29 | paf.write_to_log(fname, 'Started Active Session', config['log']) 30 | if os.path.exists('/etc/pacback.conf') is False: 31 | paf.write_to_log(fname, 'User Config File Is Missing!', config['log']) 32 | 33 | except (IOError, OSError): 34 | sys.exit('Critical Error! Pacback Already Has An Active Session Running.') 35 | 36 | 37 | def unlock(config): 38 | ''' 39 | Ends the active session started by session.lock(). 40 | The flock held on config['slock'] is released automatically when the process exits. 41 | ''' 42 | fname = 'session.unlock()' 43 | paf.write_to_log(fname, 'Ended Active Session', config['log']) 44 | paf.end_log(fname, config['log'], config['log_length']) 45 | 46 | 47 | def abort_fail(func, output, message, config): 48 | ''' 49 | This is a surrogate function for other functions to safely abort runtime during a failure. 50 | It reports the func sending the kill request as the origin, rather than session.abort_fail(). 51 | ''' 52 | paf.write_to_log(func, 'FAILURE: ' + output, config['log']) 53 | unlock(config) 54 | paf.prError(message) 55 | sys.exit() 56 | 57 | 58 | def abort(func, output, message, config): 59 | ''' 60 | This is a surrogate function for other functions to safely abort runtime WITHOUT reporting 61 | as an internal error. This is useful for non-critical issues that still require a runtime exit. 62 | It reports the func sending the kill signal as the origin, rather than session.abort(). 
63 | ''' 64 | paf.write_to_log(func, 'ABORT: ' + output, config['log']) 65 | unlock(config) 66 | paf.prBold(message) 67 | sys.exit(0) 68 | 69 | 70 | def sig_catcher(log, signum, frame): 71 | ''' 72 | This is called whenever an exit signal is received (AKA: KeyboardInterrupt). 73 | It lets pacback exit somewhat safely during the event of a kill. 74 | ''' 75 | abort_fail('SIGINT', 'Caught SIGINT ' + str(signum), '\nAttempting Clean Exit', log) 76 | 77 | 78 | ############################# 79 | # Snapshot Hook Management 80 | ########################### 81 | 82 | def hlock_start(config): 83 | ''' 84 | This starts a hook lock, overwriting the previous lock. 85 | This should be triggered at the end of a successful `--hook` run. 86 | ''' 87 | fname = 'session.hlock_start(' + str(config['hook_cooldown']) + ')' 88 | stime = 'Created: ' + dt.datetime.now().strftime("%Y:%m:%d:%H:%M:%S") 89 | paf.export_iterable(config['hlock'], [stime]) 90 | paf.write_to_log(fname, 'Created Hook Lock With ' + str(config['hook_cooldown']) + ' Second Cooldown', config['log']) 91 | 92 | 93 | def hlock_kill(config): 94 | ''' 95 | Removes the hook lock file without any checks. 96 | This currently isn't used anywhere; it's just future-proofing. 97 | ''' 98 | fname = 'session.hlock_kill()' 99 | paf.rm_file(config['hlock'], sudo=False) 100 | paf.write_to_log(fname, 'Force Ended Hook Lock!', config['log']) 101 | 102 | 103 | def hlock_check(config): 104 | ''' 105 | If config['hlock'] exists, this checks whether it was created less than 106 | config['hook_cooldown'] seconds ago. 
107 | ''' 108 | fname = 'session.hlock_check(' + str(config['hook_cooldown']) + ')' 109 | if os.path.exists(config['hlock']): 110 | f = str(open(config['hlock'], 'r').readlines())[11:-4] 111 | f = f.split(':') 112 | hc = dt.datetime(int(f[0]), int(f[1]), int(f[2]), int(f[3]), int(f[4]), int(f[5])) 113 | sec_dif = (dt.datetime.now() - hc).total_seconds() 114 | 115 | if sec_dif > config['hook_cooldown']: 116 | paf.write_to_log(fname, 'Passed Cooldown Check', config['log']) 117 | else: 118 | abort(fname, 'A Hook Lock Was Created ' + str(sec_dif) + ' Seconds Ago!', 119 | 'Aborting: A Snapshot Was Created Less Than ' + str(config['hook_cooldown']) + ' Seconds Ago!', config) 120 | else: 121 | paf.write_to_log(fname, 'Passed Check, No Previous Lock Found', config['log']) 122 | 123 | 124 | ########################## 125 | # Load User Config File 126 | ######################## 127 | 128 | def load_config(): 129 | ''' 130 | Loads user config options from /etc/pacback.conf and returns the merged result. 131 | ''' 132 | mandatory = ['hook_cooldown', 'max_ss', 'reboot'] 133 | optional = ['old_rp', 'keep_versions', 'reboot_offset', 'log_length', 'basepath', 'rp_paths', 'ss_paths'] 134 | default = { 135 | 'version': '2.1.0', 136 | 'paf': '412fd69', 137 | 'log': '/var/log/pacback.log', 138 | 'slock': '/var/lib/pacback/pacback_session.lck', 139 | 'hlock': '/var/lib/pacback/pacback_hook.lck', 140 | 'basepath': '/var/lib/pacback', 141 | 'rp_paths': '/var/lib/pacback/restore-points', 142 | 'ss_paths': '/var/lib/pacback/snapshots', 143 | 'hook_cooldown': 300, 144 | 'max_ss': 25, 145 | 'log_length': 0, # 0 Sets the Log to Infinite 146 | 'keep_versions': 3, 147 | 'old_rp': 180, 148 | 'reboot': True, 149 | 'reboot_offset': 5 150 | } 151 | 152 | if os.path.exists('/etc/pacback.conf'): 153 | user_config = paf.read_config('/etc/pacback.conf', mandatory, optional) 154 | default.update(user_config) 155 | 156 | return default 157 | 
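The cooldown comparison performed by `hlock_check()` above can be exercised in isolation. The sketch below is illustrative only: `cooldown_passed` is a hypothetical helper (not part of pacback) that takes an already-extracted timestamp string in the same `%Y:%m:%d:%H:%M:%S` layout that `hlock_start()` writes, plus an explicit `now`, and reproduces the elapsed-seconds check:

```python
import datetime as dt

def cooldown_passed(stamp, cooldown, now):
    # stamp uses the "%Y:%m:%d:%H:%M:%S" layout written by hlock_start()
    f = [int(x) for x in stamp.split(':')]
    created = dt.datetime(f[0], f[1], f[2], f[3], f[4], f[5])
    # True only when more than `cooldown` seconds have elapsed
    return (now - created).total_seconds() > cooldown

now = dt.datetime(2020, 5, 1, 12, 10, 0)
print(cooldown_passed('2020:05:01:12:00:00', 300, now))  # 600s elapsed -> True
print(cooldown_passed('2020:05:01:12:09:00', 300, now))  # 60s elapsed -> False
```

Note that `hlock_check()` itself first has to strip the `'Created: '` prefix and the `str()`/`readlines()` wrapping from the lock file before the date parts can be split, which is what its `[11:-4]` slice handles.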
-------------------------------------------------------------------------------- /core/user.py: -------------------------------------------------------------------------------- 1 | #! /usr/bin/env python3 2 | import os 3 | import datetime as dt 4 | from rich.table import Table 5 | from rich.console import Console 6 | 7 | # Local Modules 8 | import paf 9 | import meta 10 | import utils 11 | 12 | 13 | ######################### 14 | # Remove Restore Point 15 | ####################### 16 | 17 | def remove_rp(config, num, nc): 18 | fname = 'user.remove_rp(' + str(num) + ')' 19 | 20 | rm_info = { 21 | 'id': str(num).zfill(2), 22 | 'type': 'rp', 23 | 'TYPE': 'Restore Point', 24 | 'meta': config['rp_paths'] + '/rp' + str(num).zfill(2) + '.meta', 25 | 'meta_md5': config['rp_paths'] + '/.rp' + str(num).zfill(2) + '.md5', 26 | 'path': config['rp_paths'] + '/rp' + str(num).zfill(2) 27 | } 28 | 29 | if nc is False: 30 | if paf.yn_frame('Are You Sure You Want to Remove This Restore Point?') is not True: 31 | return 32 | 33 | utils.remove_id(config, rm_info) 34 | paf.prSuccess('Restore Point Removed!') 35 | paf.write_to_log(fname, 'Removed Restore Point ' + str(num), config['log']) 36 | 37 | 38 | ######################################### 39 | # List All Snapshots and Restore Points 40 | ####################################### 41 | 42 | def list_all(config): 43 | ''' 44 | This presents all the currently created restore points and snapshots. 
45 | ''' 46 | rps = sorted(m for m in paf.scan_dir(config['rp_paths'])[0] if m.endswith('.meta')) 47 | sss = sorted(m for m in paf.scan_dir(config['ss_paths'])[0] if m.endswith('.meta')) 48 | 49 | # Get Restore Point Data 50 | rps_data = list() 51 | for m in rps: 52 | num = m[-7] + m[-6] 53 | d = meta.read(config, m) 54 | rps_data.append(num + ' - Pkgs: ' + d['pkgs_installed'] + ' Created: ' + d['date']) 55 | if not rps_data: 56 | rps_data.append('NONE') 57 | 58 | # Get Snapshot Data 59 | sss_data = list() 60 | for m in sss: 61 | num = m[-7] + m[-6] 62 | d = meta.read(config, m) 63 | sss_data.append(num + ' - Pkgs: ' + d['pkgs_installed'] + ' Created: ' + d['date']) 64 | if not sss_data: 65 | sss_data.append('NONE') 66 | 67 | # Build Table 68 | t = Table(title='Pacback Restore Points and Snapshots') 69 | t.add_column('Restore Points', justify='left', style='white', no_wrap=True) 70 | t.add_column('Snapshots', justify='right', style='blue', no_wrap=True) 71 | 72 | # This Builds The Table Output Line by Line 73 | counter = 0 74 | for x in range(0, max(len(l) for l in [rps_data, sss_data])): 75 | try: 76 | a = str(rps_data[counter]) 77 | except Exception: 78 | a = '' 79 | 80 | try: 81 | b = str(sss_data[counter]) 82 | except Exception: 83 | b = '' 84 | 85 | t.add_row(a, b) 86 | counter += 1 87 | 88 | console = Console() 89 | console.print(t) 90 | 91 | 92 | ######################## 93 | # Print Info About ID 94 | ###################### 95 | 96 | def print_info(config, selction): 97 | ''' 98 | This function processes a meta data file without validating it, 99 | then compares the file to now and presents the results in a table. 100 | This acts as a 'dry run' of sorts not only showing info in the meta data 101 | file but also showing what would be changed if actually restored. 102 | The code is kind of gross but I'm not inclined to fix it. 
103 | ''' 104 | # Build Base Vars 105 | m_num = selction[2:].zfill(2) 106 | 107 | if selction.startswith('rp'): 108 | m_path = config['rp_paths'] + '/rp' + m_num + '.meta' 109 | elif selction.startswith('ss'): 110 | m_path = config['ss_paths'] + '/ss' + m_num + '.meta' 111 | 112 | # Return if Missing 113 | if not os.path.exists(m_path): 114 | return paf.prError(selction.upper() + ' Was NOT Found!') 115 | 116 | # Load Meta and Compare 117 | m = meta.read(config, m_path) 118 | compare = meta.compare_now(config, m) 119 | 120 | # Build Data For Table 121 | c1 = [ 122 | 'Installed Packages: ' + m['pkgs_installed'], 123 | 'Date: ' + m['date'], 124 | 'Time: ' + m['time'], 125 | 'Pacback Version: ' + m['version'], 126 | 'User Label: ' + m['label'] 127 | ] 128 | 129 | if m['stype'] == 'Full': 130 | c1.append('Packages Cached: ' + m['pkgs_cached']) 131 | c1.append('Cache Size: ' + m['cache_size']) 132 | 133 | if m['dir_list']: 134 | c1.append('') 135 | c1.append('File Count: ' + m['file_count']) 136 | c1.append('Raw File Size: ' + m['file_raw_size']) 137 | c1.append('Compressed Size: ' + m['tar_size']) 138 | c1.append('') 139 | c1.append('Directory List') 140 | c1.append('--------------') 141 | for d in m['dir_list']: 142 | c1.append(d) 143 | 144 | c2 = list(compare['c_pkgs']) 145 | if not c2: 146 | c2.append('NONE') 147 | 148 | c3 = list(compare['a_pkgs']) 149 | if not c3: 150 | c3.append('NONE') 151 | 152 | c4 = list(compare['r_pkgs']) 153 | if not c4: 154 | c4.append('NONE') 155 | 156 | # Build Table 157 | t = Table(title=m['type'] + ' #' + m_num) 158 | t.add_column('Meta Info', justify='left', style='bold white', no_wrap=True) 159 | t.add_column('Changed Since Creation', justify='center', style='yellow', no_wrap=True) 160 | t.add_column('Added Since Creation', justify='center', style='green', no_wrap=True) 161 | t.add_column('Removed Since Creation', justify='center', style='red', no_wrap=True) 162 | 163 | # This Builds The Table Output Line by Line 164 | counter = 0 165 
| for x in range(0, max(len(l) for l in [c1, c2, c3, c4])): 166 | try: 167 | a = str(c1[counter]) 168 | except Exception: 169 | a = '' 170 | 171 | try: 172 | b = str(c2[counter]) 173 | except Exception: 174 | b = '' 175 | 176 | try: 177 | c = str(c3[counter]) 178 | except Exception: 179 | c = '' 180 | 181 | try: 182 | d = str(c4[counter]) 183 | except Exception: 184 | d = '' 185 | 186 | t.add_row(a, b, c, d) 187 | counter += 1 188 | 189 | console = Console() 190 | console.print(t) 191 | 192 | 193 | #################################### 194 | # Compare Two IDs Defined By User 195 | ################################## 196 | 197 | def diff_meta(config, meta1, meta2): 198 | ''' 199 | This function processes two meta data files without validating either. 200 | It will compare meta1 as base compared to meta2 then present the results in a table. 201 | The code is kind of gross but I'm not inclined to fix it. 202 | ''' 203 | # Build Base Vars 204 | m1_num = meta1[2:].zfill(2) 205 | m2_num = meta2[2:].zfill(2) 206 | 207 | if meta1.startswith('rp'): 208 | m1_path = config['rp_paths'] + '/rp' + m1_num + '.meta' 209 | elif meta1.startswith('ss'): 210 | m1_path = config['ss_paths'] + '/ss' + m1_num + '.meta' 211 | 212 | if meta2.startswith('rp'): 213 | m2_path = config['rp_paths'] + '/rp' + m2_num + '.meta' 214 | elif meta2.startswith('ss'): 215 | m2_path = config['ss_paths'] + '/ss' + m2_num + '.meta' 216 | 217 | # Return if Missing 218 | if not os.path.exists(m1_path): 219 | return paf.prError(meta1.upper() + ' Was NOT Found!') 220 | 221 | if not os.path.exists(m2_path): 222 | return paf.prError(meta2.upper() + ' Was NOT Found!') 223 | 224 | # Read Meta Data 225 | m1 = meta.read(config, m1_path) 226 | m2 = meta.read(config, m2_path) 227 | compare = meta.compare_meta(config, m1, m2) 228 | 229 | # Build Info For Table 230 | c1 = [ 231 | 'Installed Packages: ' + m1['pkgs_installed'], 232 | 'Date: ' + m1['date'], 233 | 'Time: ' + m1['time'], 234 | 'Pacback Version: ' + 
m1['version'], 235 | 'User Label: ' + m1['label'] 236 | ] 237 | 238 | if m1['stype'] == 'Full': 239 | c1.append('Packages Cached: ' + m1['pkgs_cached']) 240 | c1.append('Cache Size: ' + m1['cache_size']) 241 | 242 | if m1['dir_list']: 243 | c1.append('') 244 | c1.append('File Count: ' + m1['file_count']) 245 | c1.append('Raw File Size: ' + m1['file_raw_size']) 246 | c1.append('Compressed Size: ' + m1['tar_size']) 247 | c1.append('') 248 | c1.append('Directory List') 249 | c1.append('--------------') 250 | for d in m1['dir_list']: 251 | c1.append(d) 252 | 253 | c2 = list(compare['c_pkgs']) 254 | if not c2: 255 | c2.append('NONE') 256 | 257 | c3 = list(compare['a_pkgs']) 258 | if not c3: 259 | c3.append('NONE') 260 | 261 | c4 = list(compare['r_pkgs']) 262 | if not c4: 263 | c4.append('NONE') 264 | 265 | c5 = ['Installed Packages: ' + m2['pkgs_installed'], 266 | 'Date: ' + m2['date'], 267 | 'Time: ' + m2['time'], 268 | 'Pacback Version: ' + m2['version'], 269 | 'User Label: ' + m2['label']] 270 | 271 | if m2['stype'] == 'Full': 272 | c5.append('Packages Cached: ' + m2['pkgs_cached']) 273 | c5.append('Cache Size: ' + m2['cache_size']) 274 | 275 | if m2['dir_list']: 276 | c5.append('') 277 | c5.append('File Count: ' + m2['file_count']) 278 | c5.append('Raw File Size: ' + m2['file_raw_size']) 279 | c5.append('Compressed Size: ' + m2['tar_size']) 280 | c5.append('') 281 | c5.append('Directory List') 282 | c5.append('--------------') 283 | for d in m2['dir_list']: 284 | c5.append(d) 285 | 286 | # Build Table 287 | t = Table(title=m1['type'] + ' #' + m1_num + ' --------> ' + m2['type'] + ' #' + m2_num) 288 | t.add_column(meta1.upper() + ' Meta Info', justify='left', style='bold white', no_wrap=True) 289 | t.add_column('Changed Since Creation', justify='center', style='yellow', no_wrap=True) 290 | t.add_column('Added Since Creation', justify='center', style='green', no_wrap=True) 291 | t.add_column('Removed Since Creation', justify='center', style='red', no_wrap=True) 292 | 
t.add_column(meta2.upper() + ' Meta Info', justify='right', style='bold white', no_wrap=True) 293 | 294 | # This Builds The Table Output Line by Line 295 | counter = 0 296 | for x in range(0, max(len(l) for l in [c1, c2, c3, c4, c5])): 297 | try: 298 | a = str(c1[counter]) 299 | except Exception: 300 | a = '' 301 | 302 | try: 303 | b = str(c2[counter]) 304 | except Exception: 305 | b = '' 306 | 307 | try: 308 | c = str(c3[counter]) 309 | except Exception: 310 | c = '' 311 | 312 | try: 313 | d = str(c4[counter]) 314 | except Exception: 315 | d = '' 316 | 317 | try: 318 | e = str(c5[counter]) 319 | except Exception: 320 | e = '' 321 | 322 | t.add_row(a, b, c, d, e) 323 | counter += 1 324 | 325 | console = Console() 326 | console.print(t) 327 | 328 | 329 | ########################## 330 | # Better Cache Cleaning 331 | ######################## 332 | 333 | def clean_cache(config, nc): 334 | ''' 335 | This provides automated cache cleaning using pacman, paccache, and pacback. 336 | ''' 337 | fname = 'user.clean_cache()' 338 | paf.prBold('Starting Advanced Cache Cleaning...') 339 | paf.write_to_log(fname, 'Starting Advanced Cache Cleaning...', config['log']) 340 | print('') 341 | 342 | if nc is True or paf.yn_frame('Do You Want To Uninstall Orphaned Packages?') is True: 343 | os.system('/usr/bin/pacman -R $(/usr/bin/pacman -Qtdq)') 344 | paf.write_to_log(fname, 'Removed Orphaned Packages', config['log']) 345 | 346 | if nc is True or paf.yn_frame('Do You Want To Remove Old Versions of Installed Packages?') is True: 347 | os.system('/usr/bin/paccache -rk ' + str(config['keep_versions'])) 348 | paf.write_to_log(fname, 'Removed Old Package Versions', config['log']) 349 | 350 | if nc is True or paf.yn_frame('Do You Want To Remove Cached Orphans?') is True: 351 | os.system('/usr/bin/paccache -ruk0') 352 | paf.write_to_log(fname, 'Removed Cached Orphans', config['log']) 353 | 354 | if nc is True or paf.yn_frame('Do You Want To Check For Old Pacback Restore Points?') is True: 
355 | paf.write_to_log(fname, 'Starting Search For Old Restore Points...', config['log']) 356 | meta_paths = sorted(f for f in paf.find_files(config['rp_paths']) if f.endswith(".meta")) 357 | 358 | today = dt.datetime.now().strftime("%Y/%m/%d") 359 | t_split = (today.split('/')) 360 | today_dt = dt.date(int(t_split[0]), int(t_split[1]), int(t_split[2])) 361 | 362 | for m in meta_paths: 363 | rp_info = { 364 | 'id': m[-7] + m[-6], 365 | 'type': 'rp', 366 | 'TYPE': 'Restore Point', 367 | 'meta': m, 368 | 'meta_md5': config['rp_paths'] + '/.rp' + m[-7] + m[-6] + '.md5', 369 | 'path': config['rp_paths'] + '/rp' + m[-7] + m[-6], 370 | 'pkgcache': config['rp_paths'] + '/rp' + m[-7] + m[-6] + '/pkg-cache'} 371 | 372 | # Format Dates for Compare 373 | m_dict = meta.read(config, m) 374 | o_split = (m_dict['date'].split('/')) 375 | old_dt = dt.date(int(o_split[0]), int(o_split[1]), int(o_split[2])) 376 | 377 | # Check How Old Restore Point Is 378 | days = (today_dt - old_dt).days 379 | if days > config['old_rp']: 380 | paf.prWarning('Failed: ' + rp_info['TYPE'] + ' ' + rp_info['id'] + ' Is ' + str(days) + ' Days Old!') 381 | paf.write_to_log(fname, rp_info['TYPE'] + ' ' + rp_info['id'] + ' Is ' + str(days) + ' Days Old!', config['log']) 382 | if paf.yn_frame('Do You Want to Remove This ' + rp_info['TYPE'] + '?') is True: 383 | utils.remove_id(config, rp_info) 384 | paf.prSuccess('Restore Point Removed!') 385 | else: 386 | paf.write_to_log(fname, 'User Declined Removal of ' + rp_info['TYPE'] + ' ' + rp_info['id'], config['log']) 387 | 388 | else: 389 | paf.prSuccess('Passed: ' + rp_info['TYPE'] + ' ' + rp_info['id'] + ' Is ' + str(days) + ' Days Old') 390 | paf.write_to_log(fname, rp_info['TYPE'] + ' ' + rp_info['id'] + ' Is ' + str(days) + ' Days Old', config['log']) 391 | 392 | paf.write_to_log(fname, 'Finished Advanced Cache Cleaning', config['log']) 393 | 394 | 395 | # def ss_timeline(config, num1, num2): 396 | # ''' 397 | # In the future I would like to add something 
that looks like this. 398 | # ''' 399 | # --------------------- NOW ----------------------- 400 | # Time: -xxx days | Packages: xxxx | Kernel: xx.xx.xx 401 | # --------------------------------------------------- 402 | # ^ ^ ^ 403 | # | | | 404 | # Changed: xxx Removed: xxx Added: xxx 405 | # | | | 406 | # --------------------- SS 00 ----------------------- 407 | # Time: -xxx days | Packages: xxxx | Kernel: xx.xx.xx 408 | # --------------------------------------------------- 409 | # ^ ^ ^ 410 | # | | | 411 | # Changed: xxx Removed: xxx Added: xxx 412 | # | | | 413 | # --------------------- SS 01 ----------------------- 414 | # Time: -xxx days | Packages: xxxx | Kernel: xx.xx.xx 415 | # --------------------------------------------------- 416 | # ^ ^ ^ 417 | # | | | 418 | # Changed: xxx Removed: xxx Added: xxx 419 | # | | | 420 | # --------------------- SS 02 ----------------------- 421 | # Time: -xxx days | Packages: xxxx | Kernel: xx.xx.xx 422 | # --------------------------------------------------- 423 | # ^ ^ ^ 424 | # | | | 425 | # Changed: xxx Removed: xxx Added: xxx 426 | # | | | 427 | # --------------------- SS 03 ----------------------- 428 | # Time: -xxx days | Packages: xxxx | Kernel: xx.xx.xx 429 | # --------------------------------------------------- 430 | -------------------------------------------------------------------------------- /core/utils.py: -------------------------------------------------------------------------------- 1 | #! 
/usr/bin/env python3 2 | import re 3 | import os 4 | import stat 5 | import itertools 6 | import subprocess 7 | import multiprocessing as mp 8 | 9 | # Local Modules 10 | import paf 11 | 12 | 13 | ############################## 14 | # Utils For Other Functions 15 | ############################ 16 | 17 | def remove_id(config, info): 18 | '''Remove a selected id based on type.''' 19 | fname = "utils.remove_id(" + info['type'] + info['id'] + ")" 20 | 21 | paf.rm_file(info['meta'], sudo=False) 22 | paf.rm_file(info['meta_md5'], sudo=False) 23 | if info['type'] == 'rp': 24 | paf.rm_dir(info['path'], sudo=False) 25 | 26 | paf.write_to_log(fname, 'Removal Complete', config['log']) 27 | 28 | 29 | def find_pkgs_in_dir(path): 30 | ''' 31 | Scans one or more target directories for files ending 32 | in the `.pkg.tar.zst` and `.pkg.tar.xz` extensions. 33 | ''' 34 | cache = {f for f in paf.find_files(path) 35 | if f.endswith(".pkg.tar.xz") or f.endswith(".pkg.tar.zst")} 36 | return cache 37 | 38 | 39 | def first_pkg_path(pkgs, fs_list): 40 | ''' 41 | This is a throwaway function for processing paths in chunks. 42 | This scans a set of paths for a file, adding only the first result. 43 | ''' 44 | paths = list() 45 | for pkg in pkgs: 46 | for f in fs_list: 47 | if f.split('/')[-1] == pkg: 48 | paths.append(f) 49 | break 50 | return paths 51 | 52 | 53 | def trim_pkg_list(pkg_list): 54 | ''' 55 | Removes the prefix dir and the x86_64.pkg.tar.zst suffix. 56 | This seems to be the fastest way to reduce all file paths to a unique 57 | list of package versions present on the system. 58 | ''' 59 | return {'-'.join(pkg.split('-')[:-1]) for pkg in paf.basenames(pkg_list)} 60 | 61 | 62 | def search_pkg_chunk(search, fs_list): 63 | ''' 64 | This is a throwaway function for processing pkgs in chunks. 65 | This takes a search term and returns the files that match. 66 | This function must live at module level (outside of scan_caches()) 67 | so it can be pickled and shipped to multiprocessing workers. 
68 | ''' 69 | pkgs = list() 70 | for f in fs_list: 71 | if re.search(search, f.lower()): 72 | pkgs.append(f) 73 | return pkgs 74 | 75 | 76 | def user_pkg_search(search_pkg, cache): 77 | ''' 78 | Provides more accurate searches for single pkg names without a version. 79 | ''' 80 | pkgs = trim_pkg_list(cache) 81 | found = set() 82 | 83 | for p in pkgs: 84 | r = re.split(r"\d+-\d+|\d+(?:\.\d+)+|\d:\d+(?:\.\d+)+", p)[0] 85 | if r.strip()[-1] == '-': 86 | r = r.strip()[:-1] 87 | if re.fullmatch(re.escape(search_pkg.lower().strip()), r): 88 | found.add(p) 89 | 90 | if not found: 91 | paf.prWarning('No Packages Found! Extending Regex Search...') 92 | for p in pkgs: 93 | if re.findall(re.escape(search_pkg.lower().strip()), p): 94 | found.add(p) 95 | 96 | return found 97 | 98 | 99 | def fetch_new_mirrorlist(): 100 | ''' 101 | Allows a user to fetch a new Arch Linux mirrorlist directly. 102 | Returns True if the download ran successfully and False if not. 103 | ''' 104 | countries = ['all', 'US', 'CA', 'GB', 'AU', 'FR', 'DE', 'NL', 'JP', 'KR'] 105 | country = paf.multi_choice_frame(countries) 106 | if country is not False: 107 | url = 'https://www.archlinux.org/mirrorlist/?country=' + country + '&protocol=https&use_mirror_status=on' 108 | paf.download_url(url, '/etc/pacman.d/mirrorlist') 109 | paf.sed_uncomment_line('Server', '/etc/pacman.d/mirrorlist', sudo=False) 110 | return True 111 | else: 112 | return False 113 | 114 | 115 | ############################ 116 | # Package And Cache Utils 117 | ########################## 118 | 119 | def find_cache_paths(config): 120 | ''' 121 | Builds a list of every directory that may hold cached packages. 122 | ''' 123 | paths = list() 124 | for line in paf.read_file('/etc/pacman.conf'): 125 | if line.startswith('CacheDir'): 126 | paths.append(line.split('=')[-1].strip()) 127 | 128 | if not paths: 129 | paths.append('/var/cache/pacman/pkg') 130 | 131 | for u in paf.list_normal_users(): 132 | if os.path.exists(u[5] + '/.cache'): 133 | 
paths.append(u[5] + '/.cache') 134 | 135 | paths.append(config['basepath']) 136 | 137 | return paths 138 | 139 | 140 | def pacman_Q(): 141 | ''' 142 | Captures the output of `pacman -Q` from stdout 143 | ''' 144 | raw = subprocess.Popen('/usr/bin/pacman -Q', stdout=subprocess.PIPE, shell=True) 145 | out = str(raw.communicate())[3:] 146 | out = out.split('\n') 147 | out = set(out[0].split('\\n')[:-1]) 148 | return out 149 | 150 | 151 | def scan_caches(config): 152 | ''' 153 | Always returns a unique list of pkgs found on the filesystem. 154 | When searching through rp directories, many 'duplicate' hardlinked files exist. 155 | This logic ensures that the list of packages returned is actually unique. 156 | ''' 157 | fname = 'utils.scan_caches()' 158 | paf.write_to_log(fname, 'Started Scanning Directories for Packages...', config['log']) 159 | 160 | # Searches Known Package Cache Locations 161 | pkg_paths = find_pkgs_in_dir(find_cache_paths(config)) 162 | unique_pkgs = list(paf.basenames(pkg_paths)) 163 | paf.write_to_log(fname, 'Searched ALL Package Cache Locations', config['log']) 164 | 165 | # Branch If Filter Is Needed 166 | if len(pkg_paths) != len(unique_pkgs): 167 | # Find Unique Packages By Inode Number 168 | inodes = set() 169 | inode_filter = set() 170 | 171 | for x in pkg_paths: 172 | i = os.lstat(x)[stat.ST_INO] 173 | if i in inodes: 174 | pass 175 | else: 176 | inode_filter.add(x) 177 | inodes.add(i) 178 | 179 | paf.write_to_log(fname, 'Found ' + str(len(inode_filter)) + ' Package Inodes!', config['log']) 180 | 181 | if len(inode_filter) != len(unique_pkgs): 182 | # THIS SHOULD BASICALLY NEVER RUN 183 | paf.write_to_log(fname, 'File System Contains Non-Hardlinked Duplicate Packages!', config['log']) 184 | paf.write_to_log(fname, 'Attempting to Filter Packages With Regex...', config['log']) 185 | thread_cap = 4 186 | 187 | # This Chunks the List of unique_pkgs Into Pieces 188 | chunk_size = int(round(len(unique_pkgs) / paf.max_threads(thread_cap), 0)) + 1 
189 | chunks = [unique_pkgs[i:i + chunk_size] for i in range(0, len(unique_pkgs), chunk_size)] 190 | 191 | # Creates a Pool of Worker Processes to Filter Based on File Name 192 | with mp.Pool(processes=paf.max_threads(thread_cap)) as pool: 193 | filter_fs = pool.starmap(first_pkg_path, zip(chunks, itertools.repeat(inode_filter))) 194 | filter_fs = set(itertools.chain(*filter_fs)) 195 | 196 | else: 197 | filter_fs = inode_filter 198 | 199 | paf.write_to_log(fname, 'Returned ' + str(len(filter_fs)) + ' Unique Cache Packages', config['log']) 200 | return filter_fs 201 | 202 | 203 | else: 204 | paf.write_to_log(fname, 'Returned ' + str(len(pkg_paths)) + ' Cached Packages', config['log']) 205 | return pkg_paths 206 | 207 | 208 | def search_cache(pkg_list, fs_list, config): 209 | ''' 210 | Searches the cache for matching pkg versions and returns the results. 211 | Because of the way files are named and the output given by pacman -Q, 212 | regex is needed to find the version in the cached package path. 
213 | No performance is gained with more than 4 workers on this function. 214 | ''' 215 | fname = 'utils.search_cache(' + str(len(pkg_list)) + ')' 216 | thread_cap = 4 217 | 218 | # Combining Package Names Into One Term Provides Much Faster Results 219 | paf.write_to_log(fname, 'Started Search for Matching Versions...', config['log']) 220 | bulk_search = ('|'.join(list(re.escape(pkg) for pkg in pkg_list))) 221 | 222 | # Chunks List of Searches Into Pieces For Multi-Process Search 223 | chunk_size = int(round(len(fs_list) / paf.max_threads(thread_cap), 0)) + 1 224 | fs_list = list(fs_list) 225 | chunks = [fs_list[i:i + chunk_size] for i in range(0, len(fs_list), chunk_size)] 226 | 227 | # Creates a Pool of Worker Processes to Run Regex Searches 228 | with mp.Pool(processes=paf.max_threads(thread_cap)) as pool: 229 | found_pkgs = pool.starmap(search_pkg_chunk, zip(itertools.repeat(bulk_search), chunks)) 230 | found_pkgs = set(itertools.chain(*found_pkgs)) 231 | 232 | paf.write_to_log(fname, 'Found ' + str(len(found_pkgs)) + ' OUT OF ' + str(len(pkg_list)) + ' Packages', config['log']) 233 | return found_pkgs 234 | 235 | 236 | ################################ 237 | # Pacman Pre-Transaction Hook 238 | ############################## 239 | 240 | def pacman_hook(install, config): 241 | ''' 242 | Installs or removes a standard alpm hook in /usr/share/libalpm/hooks/ 243 | which runs as a PreTransaction hook during every pacman transaction. 
243 | `install = True` Installs Pacman Hook 244 | `install = False` Removes Pacman Hook 245 | ''' 246 | 247 | if install is True: 248 | fname = 'utils.pacman_hook(install)' 249 | paf.write_to_log(fname, 'Starting Hook Installation...', config['log']) 250 | 251 | hook = [ 252 | '[Trigger]', 253 | 'Operation = Install', 254 | 'Operation = Remove', 255 | 'Operation = Upgrade', 256 | 'Type = Package', 257 | 'Target = *', 258 | '', 259 | '[Action]', 260 | 'Description = Pre-Upgrade Pacback Hook', 261 | 'Depends = pacman', 262 | 'When = PreTransaction', 263 | 'Exec = /usr/bin/pacback --hook' 264 | ] 265 | 266 | paf.export_iterable('/usr/share/libalpm/hooks/pacback.hook', hook) 267 | paf.prSuccess('Pacback Hook is Now Installed!') 268 | paf.write_to_log(fname, 'Installed Pacback PreTransaction Hook', config['log']) 269 | 270 | elif install is False: 271 | fname = 'utils.pacman_hook(remove)' 272 | paf.write_to_log(fname, 'Starting Hook Removal...', config['log']) 273 | 274 | paf.rm_file('/usr/share/libalpm/hooks/pacback.hook', sudo=False) 275 | paf.write_to_log(fname, 'Removed Pacback PreTransaction Hook', config['log']) 276 | paf.prSuccess('Pacback Hook Was Removed!') 277 | 278 | 279 | ################################### 280 | # Check If Kernel Needs A Reboot 281 | ################################# 282 | 283 | def reboot_check(config): 284 | ''' 285 | Checks the running and installed kernel versions 286 | to determine if a reboot is needed. 
287 |     '''
288 |     fname = 'utils.reboot_check()'
289 | 
290 |     cmd = "file -bL /boot/vmlinuz* | grep -o 'version [^ ]*' | cut -d ' ' -f 2 && uname -r"
291 |     raw = subprocess.Popen(cmd, stdout=subprocess.PIPE, shell=True)
292 |     out = raw.communicate()[0].decode()
293 |     out = out.strip()
294 |     out = out.split('\n')
295 | 
296 |     if out[0].strip() != out[1].strip():
297 |         paf.write_to_log(fname, 'The Installed Kernel Has Changed From ' + out[1].strip() + ' To ' + out[0].strip(), config['log'])
298 |         paf.prWarning('Your Installed Kernel Has Changed From ' + out[1].strip() + ' To ' + out[0].strip())
299 | 
300 |         if config['reboot'] is True:
301 |             if paf.yn_frame('Do You Want To Schedule A Reboot In ' + str(config['reboot_offset']) + ' Minutes?') is True:
302 |                 os.system("shutdown -r $(date --date='" + str(config['reboot_offset']) + " minute' +%H:%M)")
303 |                 paf.write_to_log(fname, 'User Scheduled A Reboot In ' + str(config['reboot_offset']) + ' Minutes', config['log'])
304 |             else:
305 |                 paf.write_to_log(fname, 'User Declined System Reboot', config['log'])
306 |         else:
307 |             paf.write_to_log(fname, 'A Reboot Is Needed For The Whole Downgrade To Take Effect!', config['log'])
308 | 
309 |     else:
310 |         paf.write_to_log(fname, 'The Kernel Hasn\'t Been Changed, So A Reboot Is Unnecessary', config['log'])
311 | 
312 | 
313 | ################################
314 | # Get Size of Cached Packages
315 | ##############################
316 | 
317 | def cache_size(config):
318 |     '''
319 |     Gets the size of cached packages as reported by applications like du,
320 |     and also the real size without counting hardlinks more than once.
321 |     '''
322 |     fname = 'utils.cache_size()'
323 |     paf.write_to_log(fname, 'Started Calculating Cache Size...', config['log'])
324 | 
325 |     caches = find_cache_paths(config)
326 |     pacman_cache = find_pkgs_in_dir(caches[0])
327 |     user_cache = find_pkgs_in_dir(caches[1:-1])
328 |     pacback_cache = find_pkgs_in_dir(caches[-1:])
329 | 
330 |     # Skip Pacback Packages That Are Hardlinks Into Another Cache
331 |     inodes = {os.lstat(x)[stat.ST_INO] for x in {*pacman_cache, *user_cache}}
332 |     pacback_filter = set()
333 | 
334 |     for x in pacback_cache:
335 |         i = os.lstat(x)[stat.ST_INO]
336 |         if i not in inodes:
337 |             pacback_filter.add(x)
338 |             inodes.add(i)
339 | 
340 |     all_cache = {*pacman_cache, *user_cache, *pacback_cache}
341 |     pkg_total = len(pacman_cache) + len(user_cache) + len(pacback_filter)
342 | 
343 |     # Calculate Size On Disk
344 |     pacman_size = paf.convert_size(paf.size_of_files(pacman_cache))
345 |     user_size = paf.convert_size(paf.size_of_files(user_cache))
346 |     pacback_size = paf.convert_size(paf.size_of_files(pacback_filter))
347 |     reported_size = paf.convert_size(paf.size_of_files(all_cache))
348 |     paf.write_to_log(fname, 'Returning Cache Size', config['log'])
349 | 
350 |     return (str(pkg_total), pacman_size, user_size, pacback_size, reported_size)
--------------------------------------------------------------------------------
/core/version.py:
--------------------------------------------------------------------------------
1 | #! /usr/bin/env python3
2 | 
3 | # Local Modules
4 | import paf
5 | import session
6 | 
7 | 
8 | def compare(config, target_version):
9 |     '''
10 |     Parses the current and target versions and branches if an upgrade is needed.
11 |     '''
12 |     fname = 'version.compare()'
13 | 
14 |     # Current Version
15 |     cv_M, cv_m, cv_p = (int(n) for n in config['version'].split('.'))
16 | 
17 |     # Target Version
18 |     tv_M, tv_m, tv_p = (int(n) for n in target_version.split('.'))
19 | 
20 |     versions = ((cv_M, cv_m, cv_p), (tv_M, tv_m, tv_p))
21 | 
22 |     if config['version'] != target_version:
23 |         paf.write_to_log(fname, 'Current Version ' + config['version'] + ' Mismatched With ' + target_version, config['log'])
24 | 
25 |         # Check for Versions Before V1.5.0
26 |         if tv_M == 1 and tv_m < 5:
27 |             paf.prError('Restore Points Generated Before V1.5.0 Are Not Backwards Compatible With Newer Versions of Pacback!')
28 |             paf.write_to_log(fname, 'Detected a Restore Point Generated Before V1.5.0', config['log'])
29 |             session.abort_fail(fname, 'Can\'t Upgrade or Restore Versions Created Before V1.5',
30 |                                'Aborting!', config['log'])
31 | 
32 |         # Check for Alpha Versions After V1.5
33 |         elif tv_M == 1 and tv_m > 5:
34 |             paf.write_to_log(fname, 'Detected Alpha Restore Point!', config['log'])
35 | 
36 |     else:
37 |         paf.write_to_log(fname, 'Both Versions Match ' + config['version'], config['log'])
38 | 
39 |     return versions
--------------------------------------------------------------------------------
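The chunked, multi-threaded search in `utils.search_cache` can be sketched in isolation. This is a simplified stand-in, not the shipped implementation: the helper names `search_chunk` and `parallel_search` are hypothetical, and `multiprocessing.dummy`'s thread pool replaces paf's thread-capped process pool.

```python
import itertools
import re
from multiprocessing.dummy import Pool  # thread-based stand-in for mp.Pool


def search_chunk(pattern, chunk):
    # Return every filename in this chunk matching the combined regex
    return [f for f in chunk if re.search(pattern, f)]


def parallel_search(pkg_list, fs_list, threads=4):
    # Combining all package names into one alternation pattern means each
    # file is scanned once, rather than once per package
    bulk_search = '|'.join(re.escape(pkg) for pkg in pkg_list)

    # Split the file list into roughly equal pieces, one per worker
    chunk_size = len(fs_list) // threads + 1
    chunks = [fs_list[i:i + chunk_size] for i in range(0, len(fs_list), chunk_size)]

    # Each worker searches its own chunk; the results are merged into one set
    with Pool(processes=threads) as pool:
        found = pool.starmap(search_chunk, zip(itertools.repeat(bulk_search), chunks))
    return set(itertools.chain(*found))
```

`pool.starmap` expects an iterable of argument tuples, which is why the single shared pattern is paired with each chunk via `itertools.repeat`.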
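The version gates in `version.compare` reduce to ordinary tuple comparison once the dotted strings are parsed into integers. A minimal sketch, with hypothetical helper names:

```python
def parse_version(v):
    # '1.5.2' -> (1, 5, 2); hypothetical helper, not part of pacback
    return tuple(int(part) for part in v.split('.'))


def is_pre_v1_5(v):
    # Restore points generated before V1.5.0 are not backwards compatible
    return parse_version(v) < (1, 5, 0)
```

Python compares tuples element-wise, so `(1, 4, 9) < (1, 5, 0)` holds without any per-field logic like the `tv_M`/`tv_m` checks above.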