Scanning all newly published packages on PyPI, I know that the quality is often quite low. I try to filter out the worst ones and list here those that might be worth a look, worth following, or that might inspire you in some way.

A small ‘assembler’ for ELF executables. ELFHex is a simple ‘assembler’ designed for learning machine code. It takes programs comprising machine instructions and packages them into simple 32-bit ELF executable binaries. It aims to do the minimum amount of transformation necessary to keep the output binaries understandable and easy to relate back to the source files. Nevertheless, it has several language features (beyond just constructing the ELF header) to make it more convenient than just trying to write an executable using a hex editor.
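
This is not ELFHex's actual API; as a sketch of what "constructing the ELF header" involves, the 52-byte ELF32 header such a tool has to emit can be packed with Python's `struct` module. Field values follow the ELF specification; the entry address below is illustrative.

```python
import struct

def elf32_header(entry_point: int) -> bytes:
    """Pack a minimal 32-bit little-endian ELF executable header (52 bytes)."""
    # e_ident: magic, ELFCLASS32, little-endian data, version 1, then padding.
    e_ident = b"\x7fELF" + bytes([1, 1, 1]) + b"\x00" * 9
    return struct.pack(
        "<16sHHIIIIIHHHHHH",
        e_ident,
        2,            # e_type: ET_EXEC (executable file)
        3,            # e_machine: EM_386 (x86)
        1,            # e_version
        entry_point,  # e_entry: virtual address where execution starts
        52,           # e_phoff: program header table right after this header
        0,            # e_shoff: no section headers
        0,            # e_flags
        52,           # e_ehsize: size of this header
        32,           # e_phentsize: size of one program header entry
        1,            # e_phnum: one loadable segment
        0, 0, 0,      # e_shentsize, e_shnum, e_shstrndx: no sections
    )

header = elf32_header(0x08048000)  # typical x86 load address, for illustration
```

A program header and the machine code itself would follow; the point is that every byte of the output stays easy to relate back to the spec.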

Manipulate Jupyter kernels and run them in other environments (Docker, Lmod, etc.). Sometimes one needs to execute Jupyter kernels in a different environment. Say you want to execute the kernel in a conda environment (that is easy, but it misses setting certain environment variables), or run it inside a Docker container. One could manually adjust the kernelspec files to set environment variables or run commands before starting the kernel; envkernel automates this process.
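
envkernel's own interface is not shown here, but the kernelspec format it automates is standard Jupyter: a `kernel.json` with an `argv` launch command and an `env` mapping. A sketch of the manual version (the conda environment name and variable are hypothetical):

```python
import json, os, tempfile

# A Jupyter kernelspec is a kernel.json file; "env" sets environment
# variables for the kernel process and "argv" is the launch command.
kernelspec = {
    "display_name": "Python (conda: myenv)",   # hypothetical environment name
    "language": "python",
    "argv": ["conda", "run", "-n", "myenv",    # wrap the usual kernel launch
             "python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
    "env": {"MY_VAR": "value"},                # extra variable for the kernel
}

kernel_dir = tempfile.mkdtemp()
with open(os.path.join(kernel_dir, "kernel.json"), "w") as f:
    json.dump(kernelspec, f, indent=2)
```

Writing and maintaining these by hand for every environment is exactly the chore the package removes.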

Face recognition engine

Reusable Python code for projects

A Keras implementation, with GPU support, of the Doc2Vec network.

An SDK for educational robotics

A small package to download and parse the MNIST dataset
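
This package's API is not reproduced here, but the MNIST files use the simple big-endian IDX format, so a minimal parser is short:

```python
import struct

def parse_idx(data: bytes):
    """Parse the IDX format used by the MNIST files (big-endian header)."""
    # Magic number: two zero bytes, a type code (0x08 = unsigned byte),
    # then the number of dimensions.
    zeros, dtype, ndim = struct.unpack(">HBB", data[:4])
    if zeros != 0 or dtype != 0x08:
        raise ValueError("not an unsigned-byte IDX file")
    # One big-endian 32-bit size per dimension, then the raw pixel/label bytes.
    dims = struct.unpack(">" + "I" * ndim, data[4:4 + 4 * ndim])
    payload = data[4 + 4 * ndim:]
    return dims, payload
```

For the image files `dims` is `(count, 28, 28)`; for the label files it is just `(count,)`.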

Parsing and storing NIF information. The NLP Interchange Format (NIF) is an RDF/OWL-based format which allows annotating words spotted in text corpora with metadata such as part-of-speech tags, knowledge-base links, entity types, etc. Like other Python libraries (e.g., [pynif](https://…/pynif)), this library transforms NIF data into Python classes for easier processing of this information.

A library of scikit-learn-compatible text transformers that are ready to be integrated into an NLP pipeline for various classification tasks.
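
The package's own transformers are not shown here; "scikit-compatible" means implementing the `fit`/`transform` protocol so the object can sit inside a `Pipeline`. A minimal sketch of the convention (a real transformer would subclass `BaseEstimator` and `TransformerMixin`; this toy class is an assumption, not the package's API):

```python
class LowercaseTransformer:
    """Minimal text transformer following the scikit-learn convention."""

    def fit(self, X, y=None):
        # Stateless transformer: nothing to learn, but fit must return self
        # so the object can be chained inside a Pipeline.
        return self

    def transform(self, X):
        # Map each raw document to its transformed form.
        return [doc.lower() for doc in X]

    def fit_transform(self, X, y=None):
        # TransformerMixin would normally provide this for free.
        return self.fit(X, y).transform(X)
```

Anything obeying this protocol can be dropped into `Pipeline([... , ("clf", some_classifier)])` ahead of the classifier.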

Operations research framework for building metaheuristic algorithms. OR-Testbed is a framework designed to solve combinatorial optimization problems through the modeling and use of [metaheuristics](https://…/Metaheuristic). Since metaheuristics rest on a few basic assumptions, they can be adapted to a multitude of different problems. The contribution of OR-Testbed is to provide the basic structure of each metaheuristic, so that a problem (and the related structures it needs) only has to be modeled once. Then, once the functions and parameters of each metaheuristic have been adapted, the problem can be solved with all of them from a central point. In summary, the framework aims to implement the most relevant metaheuristics in the state of the art, together with the techniques commonly used in the literature to improve them. This gives developers and researchers a centralized repository of implemented metaheuristics and techniques, and a potential execution service for all of their problems.
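
OR-Testbed's API is not reproduced here; the "model the problem once, solve it with any metaheuristic" idea can be sketched generically. The toy problem class and the textbook simulated annealing loop below are illustrative assumptions: the solver only touches the problem's interface, so a tabu search or a GRASP loop could reuse the same class unchanged.

```python
import math, random

class MaxOnesProblem:
    """Toy problem modeled once: maximize the number of 1-bits in a vector."""

    def initial_solution(self):
        return [0] * 10

    def cost(self, solution):
        return -sum(solution)  # lower is better

    def neighbor(self, solution):
        s = solution[:]
        s[random.randrange(len(s))] ^= 1  # flip one random bit
        return s

def simulated_annealing(problem, temp=1.0, cooling=0.95, steps=500):
    """Generic solver: depends only on the problem interface above."""
    current = best = problem.initial_solution()
    for _ in range(steps):
        candidate = problem.neighbor(current)
        delta = problem.cost(candidate) - problem.cost(current)
        # Always accept improvements; accept worse moves with a probability
        # that shrinks as the temperature cools.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = candidate
            if problem.cost(current) < problem.cost(best):
                best = current
        temp *= cooling
    return best

random.seed(0)
best = simulated_annealing(MaxOnesProblem())
```

The framework's value is in owning the `simulated_annealing`-shaped part, so each new problem only supplies the `MaxOnesProblem`-shaped part.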

Data generator for hierarchically modeling strongly-lensed systems with Bayesian neural networks

Collection of personalized helpers for machine learning projects

SNAP (Stanford Network Analysis Platform) Python

Term frequency-inverse document frequency (TF-IDF)
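
For reference (independent of this package's API), one common TF-IDF variant, raw term count times log(N / document frequency), is short to compute from scratch:

```python
import math
from collections import Counter

def tfidf(docs):
    """TF-IDF weights for a list of tokenized documents.
    tf = raw count in the document; idf = log(N / docs containing the term)."""
    n = len(docs)
    # Document frequency: in how many documents each term appears.
    df = Counter(term for doc in docs for term in set(doc))
    return [
        {term: count * math.log(n / df[term])
         for term, count in Counter(doc).items()}
        for doc in docs
    ]

docs = [["the", "cat", "sat"], ["the", "dog", "sat"], ["the", "cat", "ran"]]
weights = tfidf(docs)
```

A term appearing in every document ("the" above) gets weight 0, which is the point of the IDF factor.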

Library for type-2 fuzzy logic research