Scanning all newly published packages on PyPI, I know that the quality is often quite poor. I try to filter out the worst ones and list here those that might be worth a look, worth following, or that might inspire you in some way.

lttb
NumPy implementation of Steinarsson’s Largest-Triangle-Three-Buckets algorithm for downsampling time-series-like data.
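A minimal usage sketch follows. I have not verified this exact API against the package; it assumes the common convention of an `lttb.downsample(data, n_out)` helper operating on an (n, 2) array of (x, y) points with increasing x.

```python
import numpy as np
import lttb  # assumed module name matching the package

# A noisy sine wave with 10,000 points, stored as (x, y) rows.
x = np.linspace(0, 10, 10_000)
data = np.column_stack([x, np.sin(x) + np.random.normal(0, 0.1, x.size)])

# Downsample to 200 visually representative points (assumed signature).
small = lttb.downsample(data, n_out=200)
print(small.shape)  # -> (200, 2)
```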

neuralnetwork
Artificial Neural Network. The library allows you to build and train multi-layer neural networks. You first define the structure of the network: the number of layers and the number of input, hidden, and output nodes. The network is then constructed. Interconnection strengths are represented using an adjacency matrix and initialised to small random values. Training data is then presented to the network incrementally. The neural network uses an online backpropagation training algorithm that applies gradient descent to descend the error curve and adjust the interconnection strengths. The aim of the training algorithm is to adjust the interconnection strengths so as to reduce the global error, which is calculated for the network as the mean squared error. You can provide a learning rate and a momentum parameter. The learning rate affects the speed at which the neural network converges to an optimal solution. The momentum parameter helps gradient descent avoid converging to a non-optimal point on the error curve, a local minimum. A well-chosen momentum value helps find the global minimum, but too large a value will prevent the neural network from ever converging to a solution. Trained neural networks can be saved to file and loaded back for later activation.
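The package’s own API is not shown here; the following is only a minimal NumPy sketch of the update the description refers to: one online backpropagation step for a single hidden layer, with a learning rate and a momentum term added to the gradient-descent update.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(4, 3))          # input -> hidden weights, small random init
W2 = rng.normal(scale=0.1, size=(3, 2))          # hidden -> output weights
V1, V2 = np.zeros_like(W1), np.zeros_like(W2)    # momentum accumulators
lr, momentum = 0.1, 0.9

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, target):
    """One online update: forward pass, backpropagate the squared error, descend with momentum."""
    global W1, W2, V1, V2
    h = sigmoid(x @ W1)                           # hidden activations
    y = sigmoid(h @ W2)                           # network output
    err = y - target
    delta_out = err * y * (1 - y)                 # output-layer error signal
    delta_hid = (delta_out @ W2.T) * h * (1 - h)  # backpropagated hidden-layer signal
    V2 = momentum * V2 - lr * np.outer(h, delta_out)   # momentum smooths the descent direction
    V1 = momentum * V1 - lr * np.outer(x, delta_hid)
    W2 = W2 + V2
    W1 = W1 + V1
    return float(np.mean(err ** 2))               # per-sample squared error

loss = train_step(np.array([0.1, 0.2, 0.3, 0.4]), np.array([1.0, 0.0]))
```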

pytorch-nce
An NCE implementation in PyTorch. Noise-Contrastive Estimation (NCE) is an approximation method used to work around the huge computational cost of a large softmax layer. The basic idea is to convert the prediction problem into a binary classification problem at training time. It has been shown that the two criteria converge to the same minimum as long as the noise distribution is close enough to the real one. NCE bridges the gap between generative and discriminative models rather than simply speeding up the softmax layer. With NCE, you can turn almost anything into a posterior with little effort (I think).
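This is not the pytorch-nce API (which I have not checked), only a conceptual sketch of the NCE objective it implements: the observed item and k noise samples become a binary classification problem, with the model score shifted by the log noise probability to form the classifier logit.

```python
import math
import torch
import torch.nn.functional as F

def nce_loss(true_scores, noise_scores, true_noise_logprob, noise_noise_logprob, k):
    """Binary NCE loss over a batch (all tensor names here are illustrative).

    true_scores:         unnormalised model scores for the observed items, shape (B,)
    noise_scores:        unnormalised model scores for k sampled noise items, shape (B, k)
    *_noise_logprob:     log-probabilities of those items under the noise distribution
    """
    # Logit of P(data | item) = s_model(item) - log(k * p_noise(item)).
    true_logits = true_scores - (math.log(k) + true_noise_logprob)        # (B,)
    noise_logits = noise_scores - (math.log(k) + noise_noise_logprob)     # (B, k)

    # Observed items are positives, sampled noise items are negatives.
    pos = F.binary_cross_entropy_with_logits(
        true_logits, torch.ones_like(true_logits), reduction="none")      # (B,)
    neg = F.binary_cross_entropy_with_logits(
        noise_logits, torch.zeros_like(noise_logits), reduction="none")   # (B, k)
    return (pos + neg.sum(dim=1)).mean()
```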

tensorflow-gcs-config
TensorFlow operations for configuring access to GCS (Google Cloud Storage) resources.

utensor_cgen
C code generation program for uTensor

bimlpa
Community detection in bipartite networks using a multi-label propagation algorithm

coltrane
Just another ML framework, built on top of scikit-learn. A general-purpose, **pipeline-oriented** machine learning framework that lets the user configure pipelines, load data, and evaluate a pipeline against the data. Who said that improvising over `Giant Steps` has to be hard? The framework eases and standardizes the research process, so the user can focus on configuring the pipeline or implementing core pipeline elements.
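I have not verified coltrane’s own interface, so rather than guess at it, here is the underlying scikit-learn workflow the description standardizes: configure a pipeline, load data, and evaluate the pipeline against the data.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_validate
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Load data.
X, y = load_iris(return_X_y=True)

# Configure a pipeline from core pipeline elements.
pipeline = Pipeline([
    ("scale", StandardScaler()),   # preprocessing step
    ("clf", SVC(kernel="rbf")),    # estimator step
])

# Evaluate the pipeline against the data.
scores = cross_validate(pipeline, X, y, cv=5, scoring="accuracy")
print(scores["test_score"].mean())
```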

ctgan
Conditional GAN for Tabular Data
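A usage sketch based on my recollection of the project’s README; the class name, the `discrete_columns` argument, and the `sample` call are assumptions, not verified against the released package.

```python
import pandas as pd
from ctgan import CTGANSynthesizer  # assumed import path

# Toy tabular data with one continuous and one categorical column.
data = pd.DataFrame({
    "age": [23, 45, 31, 52, 40, 28],
    "job": ["eng", "med", "eng", "law", "med", "law"],
})

ctgan = CTGANSynthesizer()                  # assumed constructor
ctgan.fit(data, discrete_columns=["job"])   # assumed signature
synthetic = ctgan.sample(100)               # draw 100 synthetic rows
print(synthetic.head())
```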

dagmar
A DAG-based framework for robust road estimation. This project provides the software for the algorithms and concepts for enriching real-world test-drive data, collected within the research project [FASva](https://…/fasva ), with information about the road infrastructure as a basis for context-aware scenario mining. For that purpose, the work proposes an advanced map-matching algorithm based on a particle filter for vehicle localization, employing a directed acyclic graph of mapped roads (*DAGMaR*) to robustly estimate the trajectory of the most likely vehicle poses.
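DAGMaR’s own implementation is not reproduced here; the following is only a generic bootstrap particle-filter step in NumPy to illustrate the localization idea, without the DAG-of-mapped-roads constraint that the project adds.

```python
import numpy as np

def particle_filter_step(particles, weights, motion, gps_obs, motion_noise=1.0, obs_noise=5.0):
    """One predict-update-resample step for 2-D pose particles (illustrative only)."""
    # Predict: propagate each particle along the reported motion plus process noise.
    particles = particles + motion + np.random.normal(0, motion_noise, particles.shape)
    # Update: reweight particles by the likelihood of the GPS observation.
    dist2 = np.sum((particles - gps_obs) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * dist2 / obs_noise ** 2)
    weights = weights / weights.sum()
    # Resample: draw particles proportional to their weights, then reset to uniform weights.
    idx = np.random.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

particles = np.random.normal([0.0, 0.0], 10.0, size=(500, 2))   # initial pose hypotheses
weights = np.full(500, 1 / 500)
particles, weights = particle_filter_step(particles, weights,
                                          motion=np.array([1.0, 0.5]),
                                          gps_obs=np.array([1.2, 0.4]))
```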

genetic-algorithm
A Python package implementing the genetic algorithm
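I have not checked this package’s API, so the sketch below is a generic genetic algorithm rather than its interface: truncation selection, one-point crossover, and bit-flip mutation over bit-string individuals.

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=50, generations=100, p_mut=0.02):
    # Random initial population of bit strings.
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]  # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randint(1, n_bits - 1)                            # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]     # bit-flip mutation
            children.append(child)
        pop = children
    return max(pop, key=fitness)

best = genetic_algorithm(fitness=sum)   # "one-max": maximise the number of 1s
print(sum(best))
```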
