When working on a large calculator project, I often forget to revert changes I made while testing code, which is an easy way to introduce bugs. To mitigate this I wanted to use version control, so I wrote a detokenizer based on the token XML files bundled with TokenIDE. With this Python package you can detokenize 8xp program files for TI-8x calculators. Instructions for Git integration are included in the package's readme, and custom token sets can be added using XML files.
You can download the attached zip or find the source code over here:
Edit: Version 0.2
The basic functionality took me about an evening to write. Dealing with multiple variables per file took two full days.
- The module can now deal with files containing multiple variables.
- Resource files moved to the installation directory for cross-platform support.
- New dependency: ti83f
Hey, that is pretty neat. I'll use that if I can find a way to install Python onto my thumb drive. I assume that you can detokenize programs to restore them to a previous version? Or are the detokenized files kept alongside the programs? If the program turns the detokenized strings back into programs, did you account for the fact that " and " and similar tokens can be interpreted multiple ways? The string " and " can be the 'and' function, or it can be two space tokens and three lowercase letters. TokenIDE assumes it is always the function, so some parts of text could be converted to tokens by mistake, which could cause problems in Axe (and I assume Grammer as well). TI-BASIC would be unaffected, as it uses strings of tokens, not characters. But that is just a minor complaint, if the problem exists at all. Nice work!
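The ambiguity described above comes from greedy longest-match tokenization. Here is a minimal sketch of that strategy (the token set is purely illustrative, not the real TI token tables, and this is not code from the package):

```python
# Illustrative token set: a quote, a space, three lowercase letters,
# and the multi-character " and " function token.
TOKENS = {'"', " ", "a", "n", "d", " and "}
MAX_LEN = max(len(t) for t in TOKENS)

def greedy_tokenize(text, tokens=TOKENS):
    """At each position, consume the longest substring that names a token.
    Because " and " is longer than a single space, a greedy matcher always
    prefers the 'and' function token over spaces plus letters."""
    result, i = [], 0
    while i < len(text):
        for length in range(min(len(text) - i, MAX_LEN), 0, -1):
            chunk = text[i:i + length]
            if chunk in tokens:
                result.append(chunk)
                i += length
                break
        else:
            raise ValueError(f"no token matches text at position {i}")
    return result

# The string literal '" and "' collapses to three tokens, even if the
# author meant the seven characters quote, space, a, n, d, space, quote.
print(greedy_tokenize('" and "'))  # ['"', ' and ', '"']
```

This is exactly why the ambiguity only matters for languages like Axe that interpret character strings: the greedy choice silently changes which bytes end up in the program.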
It's a one-way conversion from program to plaintext. Git can use this to show differences between two programs or versions via git show, git diff, etc., but it won't store the detokenized program.
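For anyone curious how that typically works on the Git side: Git's textconv mechanism runs a converter over each blob before diffing, so the repository keeps the raw .8xp binaries while diff output shows plaintext. A sketch of the usual setup follows; `detokenize` here is a placeholder command name, not necessarily what this package installs, so check its readme for the real invocation:

```shell
# Route .8xp files through a custom diff driver named "8xp".
echo '*.8xp diff=8xp' >> .gitattributes

# Tell Git how to turn an 8xp blob into text before diffing.
# "detokenize" is a placeholder for the package's actual command.
git config diff.8xp.textconv "detokenize"

# Diff-producing commands now show detokenized plaintext,
# while the repository still stores the binary files as-is:
git diff
git log -p -- PROG.8xp
```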