AVIATOR
This paper presents AVIATOR, a visual tool that integrates the progressive visual analytics paradigm into the IR evaluation process. The tool speeds up and facilitates the performance assessment of retrieval models, enabling result analysis through visual facilities. AVIATOR goes one step beyond the common 'compute, wait, visualize' analytics paradigm, introducing a continuous evaluation mechanism that minimizes human and computational resource consumption. …
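
As a rough illustration of the difference between 'compute, wait, visualize' and a progressive loop, the sketch below updates average precision incrementally and emits an intermediate estimate after every judged rank, so a front end could redraw continuously instead of waiting for the full run. The function and variable names are illustrative assumptions, not AVIATOR's actual API.

```python
def progressive_average_precision(ranked_doc_ids, relevant_ids):
    """Yield a running average-precision estimate after each rank position."""
    num_relevant_seen = 0
    precision_sum = 0.0
    total_relevant = max(len(relevant_ids), 1)  # avoid division by zero
    for rank, doc_id in enumerate(ranked_doc_ids, start=1):
        if doc_id in relevant_ids:
            num_relevant_seen += 1
            precision_sum += num_relevant_seen / rank
        # emit an intermediate value instead of waiting for the whole ranking
        yield precision_sum / total_relevant

run = ["d3", "d7", "d1", "d9", "d2"]
qrels = {"d3", "d2", "d8"}
for step, ap in enumerate(progressive_average_precision(run, qrels), start=1):
    print(f"after rank {step}: AP estimate = {ap:.3f}")
```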

Machine Teaching
In this paper, we consider the problem of machine teaching, the inverse problem of machine learning. Unlike traditional machine teaching, which views learners as batch algorithms, we study a new paradigm where the learner uses an iterative algorithm and a teacher can feed examples sequentially and intelligently based on the current performance of the learner. We show that the teaching complexity in the iterative case is very different from that in the batch case. Instead of constructing a minimal training set for learners, our iterative machine teaching focuses on achieving fast convergence in the learner model. Depending on the level of information the teacher has about the learner model, we design teaching algorithms which can provably reduce the number of teaching examples and achieve faster convergence than learning without teachers. We also validate our theoretical findings with extensive experiments on different data distributions and real image datasets. …
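
The greedy idea described here, where a fully informed teacher picks at each step the pool example whose single SGD update moves the learner closest to the target parameters, can be sketched for a linear least-squares learner as below. The pool size, learning rate, and names are illustrative assumptions, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)
d, pool_size, lr = 5, 200, 0.1
w_star = rng.normal(size=d)           # target model known to the teacher
X = rng.normal(size=(pool_size, d))   # candidate teaching pool
y = X @ w_star                        # labels consistent with w_star

w = np.zeros(d)                       # learner starts from scratch
for t in range(50):
    # one SGD step on squared loss for every candidate example
    grads = (X @ w - y)[:, None] * X            # per-example gradients
    candidates = w - lr * grads                 # resulting learner states
    dists = np.linalg.norm(candidates - w_star, axis=1)
    best = np.argmin(dists)                     # greedy teacher choice
    w = candidates[best]                        # learner takes that step

print("distance to target after teaching:", np.linalg.norm(w - w_star))
```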

Deep Factor Alpha
Deep Factor Alpha provides a framework for extracting nonlinear factor information to explain the time-series and cross-sectional properties of asset returns. Sorting securities on firm characteristics is viewed as a nonlinear activation function which can be implemented within a deep learning architecture. Multi-layer deep learners are constructed to augment traditional long-short factor models. Searching the firm-characteristic space over deep architectures of nonlinear transformations is compatible with the economic goal of eliminating mispricing alphas. Joint estimation of factors and betas is achieved with stochastic gradient descent. To illustrate our methodology, we design long-short latent factors in a train-validation-test framework of US stock market asset returns from 1975 to 2017. We perform an out-of-sample study to analyze Fama-French factors, in both the cross-section and the time-series, versus their deep learning counterparts. Finally, we conclude with directions for future research. …
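
A toy sketch of the idea, under my own assumptions about layer sizes and a tanh "soft sorting" activation: firm characteristics are mapped through a small network to long-short portfolio weights, the resulting factor returns are combined with betas, and both are estimated jointly by SGD to shrink the pricing errors (alphas). None of the names or modeling choices below come from the paper.

```python
import torch

torch.manual_seed(0)
n_assets, n_chars, n_factors, n_periods = 50, 10, 3, 120

chars = torch.randn(n_periods, n_assets, n_chars)   # firm characteristics
returns = torch.randn(n_periods, n_assets)          # asset excess returns

sort_net = torch.nn.Sequential(                     # maps characteristics to
    torch.nn.Linear(n_chars, 16), torch.nn.Tanh(),  # long-short portfolio
    torch.nn.Linear(16, n_factors), torch.nn.Tanh() # weights in [-1, 1]
)
betas = torch.zeros(n_assets, n_factors, requires_grad=True)
opt = torch.optim.SGD(list(sort_net.parameters()) + [betas], lr=0.01)

for epoch in range(200):
    weights = sort_net(chars)                       # (T, N, K) portfolio weights
    factors = torch.einsum("tnk,tn->tk", weights, returns)  # factor returns
    fitted = factors @ betas.T                      # (T, N) fitted returns
    alpha_loss = ((returns - fitted) ** 2).mean()   # shrink mispricing alphas
    opt.zero_grad()
    alpha_loss.backward()
    opt.step()
```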

WikiAtomicEdits
We release a corpus of 43 million atomic edits across 8 languages. These edits are mined from Wikipedia edit history and consist of instances in which a human editor has inserted a single contiguous phrase into, or deleted a single contiguous phrase from, an existing sentence. We use the collected data to show that the language generated during editing differs from the language that we observe in standard corpora, and that models trained on edits encode different aspects of semantics and discourse than models trained on raw, unstructured text. We release the full corpus as a resource to aid ongoing research in semantics, discourse, and representation learning. …
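
To make the notion of an atomic edit concrete, here is a small sketch (my own toy heuristic, not the paper's mining pipeline) that checks whether an edited sentence differs from its base by exactly one contiguous inserted phrase and, if so, returns the phrase and its position.

```python
def extract_atomic_insertion(base, edited):
    """Return (start_index, inserted_tokens) if `edited` adds exactly one
    contiguous span of tokens to `base`, else None."""
    b, e = base.split(), edited.split()
    if len(e) <= len(b):
        return None
    # longest common prefix
    i = 0
    while i < len(b) and b[i] == e[i]:
        i += 1
    # longest common suffix that does not overlap the prefix
    j = 0
    while j < len(b) - i and b[len(b) - 1 - j] == e[len(e) - 1 - j]:
        j += 1
    inserted = e[i:len(e) - j]
    # verify the rest of the sentence is unchanged
    if b[:i] + inserted + b[i:] == e:
        return i, inserted
    return None

print(extract_atomic_insertion(
    "The bridge was completed in 1932 .",
    "The bridge was finally completed in 1932 ."))  # -> (3, ['finally'])
```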