chore: Update dependencies and versions in project files
titusz committed Oct 11, 2024
2 parents 741fee9 + f164560 commit badab5a
Showing 6 changed files with 657 additions and 980 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/tests.yml
@@ -10,7 +10,7 @@ jobs:
    strategy:
      fail-fast: false
      matrix:
-       python-version: [3.9, '3.10', '3.11', '3.12']
+       python-version: ['3.10', '3.11', '3.12', '3.13']
        os: [ubuntu-latest, macos-13, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
1 change: 1 addition & 0 deletions .gitignore
@@ -8,3 +8,4 @@ __pycache__
coverage.xml
.env
.aider*
+datasets
2 changes: 1 addition & 1 deletion CHANGELOG.md
@@ -1,6 +1,6 @@
# Changelog

-## [0.1.3] - Unrelease
+## [0.1.3] - Unreleased

## [0.1.2] - 2024-08-19
- Encode granular features with base64
26 changes: 26 additions & 0 deletions CONVENTIONS.md
@@ -0,0 +1,26 @@
# Coding Conventions

- Prefer httpx over requests for making HTTP requests!
- Write pragmatic, easily testable, and performant code!
- Prefer short and pure functions where possible!
- Keep the number of function arguments below 4!
- Don't use nested functions!
- Write concise and to-the-point docstrings for all functions!
- Write PEP 484 style type comments instead of function annotations (PEP 3107)!
- Always add a correct PEP 484 style type comment as the first line after the function definition!
- Use built-in collection types as generic types for annotations (PEP 585)!
- Use the | (pipe) operator for writing union types (PEP 604)!

Example function definition with a PEP 484 type comment and docstring:

```python
def tokenize_chunks(chunks, max_len=None):
    # type: (list[str], int|None) -> dict
    """
    Tokenize text chunks into model-compatible formats.
    :param chunks: Text chunks to tokenize.
    :param max_len: Truncates chunks above max_len characters.
    :return: Dictionary of tokenized data including input IDs, attention masks, and type IDs.
    """
```
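
For illustration only (not part of the committed file), here is a minimal sketch of how the remaining conventions (httpx for HTTP calls, PEP 585 built-in generics, and PEP 604 unions in the type comment) might combine. The function name `fetch_chunks` and its parameters are hypothetical:

```python
import httpx  # preferred over requests per the conventions above


def fetch_chunks(url, limit=None):
    # type: (str, int|None) -> list[str]
    """
    Fetch newline-separated text chunks from a URL (hypothetical example).
    :param url: Endpoint to fetch the chunks from.
    :param limit: Return at most limit chunks if given.
    :return: List of non-empty text chunks.
    """
    response = httpx.get(url)
    response.raise_for_status()
    # Keep only non-empty lines, then truncate to the requested number of chunks.
    chunks = [line for line in response.text.splitlines() if line.strip()]
    return chunks[:limit] if limit is not None else chunks
```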
