CodeGRU
Recently, many NLP-based deep learning models have been applied to model source code for source code suggestion and recommendation tasks. A major limitation of these approaches is that they treat source code as simple text tokens and ignore its contextual, syntactic, and structural dependencies. In this work, we present CodeGRU, a Gated Recurrent Unit based source code language model that captures contextual, syntactic, and structural dependencies when modeling source code. CodeGRU introduces several new components. The Code Sampler is first proposed to select noise-free code samples and transform obfuscated code into its proper syntax, which helps capture syntactic and structural dependencies. The Code Regularizer is next introduced to encode source code, which helps capture its contextual dependencies. Finally, we propose a novel method that learns a variable-size context for modeling source code. We evaluated CodeGRU on a real-world dataset, and the results show that it effectively captures the contextual, syntactic, and structural dependencies that previous works fail to model. We also discuss and visualize two use cases of CodeGRU for source code modeling tasks: (1) source code suggestion and (2) source code generation. …
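The following is a minimal sketch of the kind of GRU-based, token-level code language model that CodeGRU builds on; the class name CodeTokenGRU, the hyperparameters, and the greedy next-token step are illustrative assumptions, not the authors' implementation (in particular, it omits the Code Sampler, the Code Regularizer, and the variable-size context).

```python
import torch
import torch.nn as nn

class CodeTokenGRU(nn.Module):
    """Minimal GRU language model over source-code tokens (illustrative only)."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, num_layers=1):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.gru = nn.GRU(embed_dim, hidden_dim, num_layers, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids, hidden=None):
        x = self.embed(token_ids)              # (batch, seq_len, embed_dim)
        output, hidden = self.gru(x, hidden)   # (batch, seq_len, hidden_dim)
        logits = self.out(output)              # next-token scores at each position
        return logits, hidden

# Toy usage: suggest the next token for a short, already-encoded code context.
model = CodeTokenGRU(vocab_size=1000)
context = torch.randint(0, 1000, (1, 12))      # one sequence of 12 token ids
logits, _ = model(context)
next_token = logits[0, -1].argmax().item()     # greedy suggestion for the next token
```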

IOHprofiler
IOHprofiler is a new tool for analyzing and comparing iterative optimization heuristics. Given as input algorithms and problems written in C or Python, it provides as output a statistical evaluation of the algorithms' performance by means of the distributions of the fixed-target running times and the fixed-budget function values. In addition, IOHprofiler also allows users to track the evolution of algorithm parameters, making our tool particularly useful for the analysis, comparison, and design of (self-)adaptive algorithms. IOHprofiler is ready-to-use software. It consists of two parts: an experimental part, which generates the running time data, and a post-processing part, which produces the summarizing comparisons and statistical evaluations. The experimental part is built on the COCO software, which has been adjusted to cope with optimization problems that are formulated as functions $f:\mathcal{S}^n \to \mathbb{R}$ with $\mathcal{S}$ being a discrete alphabet of integers. The post-processing part is our own work. It can be used as a stand-alone tool for the evaluation of running time data of arbitrary benchmark problems. It accepts as input not only the output files of IOHprofiler, but also original COCO data files. The post-processing tool is designed for interactive evaluation, allowing the user to choose the ranges and the precision of the displayed data according to their needs. IOHprofiler is available on GitHub at \url{https://…/IOHprofiler}.
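To illustrate the kind of data behind a fixed-target evaluation, the sketch below runs a (1+1) EA on OneMax and records the best-so-far function value per evaluation; it uses no IOHprofiler API, and the function names and parameters are assumptions made for this example only.

```python
import random

def onemax(x):
    """Benchmark problem f: {0,1}^n -> R, here simply the number of ones."""
    return sum(x)

def one_plus_one_ea(n=50, budget=5000, seed=0):
    """(1+1) EA with standard bit mutation; returns fixed-target style run data."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    best = onemax(x)
    trace = [(1, best)]                      # (evaluations, best-so-far f-value)
    for evals in range(2, budget + 1):
        y = [1 - b if rng.random() < 1.0 / n else b for b in x]
        fy = onemax(y)
        if fy >= best:
            x = y
            if fy > best:
                best = fy
                trace.append((evals, best))  # log improvements: first hitting times of new targets
        if best == n:
            break
    return trace

# Fixed-target view: first hitting time of a given target value, per run.
run = one_plus_one_ea()
target = 45
hitting_time = next((e for e, f in run if f >= target), None)
print("evaluations to reach f >=", target, ":", hitting_time)
```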

Auxiliary Inference Divergence Estimator (AIDE)
Approximate probabilistic inference algorithms are central to many fields. Examples include sequential Monte Carlo inference in robotics, variational inference in machine learning, and Markov chain Monte Carlo inference in statistics. A key problem faced by practitioners is measuring the accuracy of an approximate inference algorithm on a specific dataset. This paper introduces the auxiliary inference divergence estimator (AIDE), an algorithm for measuring the accuracy of approximate inference algorithms. AIDE is based on the observation that inference algorithms can be treated as probabilistic models and that the random variables used within an inference algorithm can be viewed as auxiliary variables. This view leads to a new estimator for the symmetric KL divergence between the output distributions of two inference algorithms. The paper illustrates the application of AIDE to algorithms for inference in regression, hidden Markov, and Dirichlet process mixture models. The experiments show that AIDE captures the qualitative behavior of a broad class of inference algorithms and can detect failure modes that are missed by standard heuristics. …
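As a toy illustration of the quantity AIDE estimates, the sketch below computes a Monte Carlo estimate of the symmetric KL divergence between two tractable Gaussians standing in for the output distributions of two inference algorithms; it is not the AIDE estimator itself, which instead bounds each term using auxiliary (meta-inference) variables when the output densities are intractable.

```python
import numpy as np

def gaussian_logpdf(x, mu, sigma):
    """Log-density of a univariate Gaussian, evaluated pointwise."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def symmetric_kl_monte_carlo(mu_p, sig_p, mu_q, sig_q, n=200_000, seed=0):
    """Monte Carlo estimate of D_KL(p||q) + D_KL(q||p) for two tractable Gaussians."""
    rng = np.random.default_rng(seed)
    xs_p = rng.normal(mu_p, sig_p, n)        # samples from p
    xs_q = rng.normal(mu_q, sig_q, n)        # samples from q
    kl_pq = np.mean(gaussian_logpdf(xs_p, mu_p, sig_p) - gaussian_logpdf(xs_p, mu_q, sig_q))
    kl_qp = np.mean(gaussian_logpdf(xs_q, mu_q, sig_q) - gaussian_logpdf(xs_q, mu_p, sig_p))
    return kl_pq + kl_qp

# Two Gaussian "approximate posteriors" standing in for inference-algorithm outputs.
print("estimated symmetric KL:", symmetric_kl_monte_carlo(0.0, 1.0, 0.5, 1.5))
```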

Hyperbolic Neural Network
Hyperbolic spaces have recently gained momentum in the context of machine learning due to their high capacity and tree-likeness properties. However, the representational power of hyperbolic geometry is not yet on par with Euclidean geometry, mostly because of the absence of corresponding hyperbolic neural network layers. This makes it hard to use hyperbolic embeddings in downstream tasks. Here, we bridge this gap in a principled manner by combining the formalism of M\"obius gyrovector spaces with the Riemannian geometry of the Poincar\'e model of hyperbolic spaces. As a result, we derive hyperbolic versions of important deep learning tools: multinomial logistic regression, feed-forward networks, and recurrent neural networks such as gated recurrent units. This makes it possible to embed sequential data and perform classification in hyperbolic space. Empirically, we show that, even though hyperbolic optimization tools are limited, hyperbolic sentence embeddings either outperform or are on par with their Euclidean variants on textual entailment and noisy-prefix recognition tasks. …
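For concreteness, the sketch below implements the basic gyrovector-space operations on the Poincaré ball (Möbius addition, the exponential map at the origin, and the geodesic distance) from which such hyperbolic layers are assembled; the numerical constants and the final "hyperbolic translation" example are illustrative assumptions, not the paper's layer definitions.

```python
import numpy as np

def mobius_add(x, y, c=1.0):
    """Mobius addition on the Poincare ball of curvature -c (gyrovector analogue of +)."""
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c**2 * x2 * y2
    return num / den

def expmap0(v, c=1.0):
    """Exponential map at the origin: sends a Euclidean (tangent) vector into the ball."""
    norm = np.linalg.norm(v) + 1e-15
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def poincare_distance(x, y, c=1.0):
    """Geodesic distance between two points of the Poincare ball."""
    diff = mobius_add(-x, y, c)
    return (2.0 / np.sqrt(c)) * np.arctanh(np.sqrt(c) * np.linalg.norm(diff))

# Hyperbolic translation: map Euclidean features into the ball, then add a hyperbolic bias.
v = np.array([0.3, -0.2])
bias = expmap0(np.array([0.05, 0.10]))
h = mobius_add(expmap0(v), bias)
print(h, poincare_distance(h, np.zeros(2)))
```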