Symbolic Program Slicing (SymPas)
Program slicing is a technique for simplifying programs by focusing on selected aspects of their behaviour. Current mainstream static slicing methods operate on the PDG (program dependence graph) or SDG (system dependence graph), but building these graph representations can be expensive and error-prone for some users. In this paper we study a lightweight approach to static program slicing, called Symbolic Program Slicing (SymPas), which works as a dataflow analysis on LLVM (Low-Level Virtual Machine). In our SymPas approach, slices are stored symbolically rather than recomputed by re-analysing procedures (cf. procedure summaries). Instead of re-analysing a procedure multiple times to find its slices for each calling context, SymPas calculates a single symbolic (or parameterized) slice which can be instantiated at call sites, avoiding re-analysis; it is implemented in LLVM to perform slicing on its intermediate representation (IR). For comparison, we systematically adapt the IFDS (Interprocedural Finite Distributive Subset) analysis and the SDG-based slicing method (SDG-IFDS) to statically slice IR programs. Evaluated on open-source and benchmark programs, our backward SymPas shows a factor-of-6 reduction in time cost and a factor-of-4 reduction in space cost compared to backward SDG-IFDS, and is thus more efficient. In addition, results from slices of 66 programs, ranging up to 336,800 IR instructions in size, show that SymPas is highly size-scalable. …
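To make the core idea concrete, here is a minimal sketch (not the authors' implementation) of what a symbolic, parameterized procedure slice looks like: the callee is sliced once, the slice is keyed by the callee's formal parameters, and each call site instantiates it through a formal-to-actual binding instead of re-analysing the callee per calling context. The tiny three-address "IR" and helper names are hypothetical.

```python
# Hypothetical three-address IR: each instruction is (dest, op, operands).

def backward_slice(instrs, criterion_vars):
    """Intraprocedural backward slice over a list of (dest, op, operands) tuples."""
    needed = set(criterion_vars)
    kept = []
    for dest, op, operands in reversed(instrs):
        if dest in needed:
            kept.append((dest, op, operands))
            needed.discard(dest)
            needed.update(v for v in operands if isinstance(v, str))
    kept.reverse()
    return kept, needed  # 'needed' = variables the slice still depends on (formals)

# Symbolic (parameterized) slice of a callee, computed exactly once.
# callee 'scale':  r = p * q
scale_ir = [("r", "mul", ("p", "q"))]
scale_slice, scale_params = backward_slice(scale_ir, {"r"})   # depends on {p, q}

# Callers instantiate the symbolic slice via a formal->actual map; 'scale' is
# never re-analysed for a new calling context.
def instantiate(param_deps, binding):
    return {binding[p] for p in param_deps if p in binding}

print(instantiate(scale_params, {"p": "a", "q": "b"}))  # slice needs {'a', 'b'}
print(instantiate(scale_params, {"p": "x", "q": "1"}))  # slice needs {'x', '1'}
```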
Imagination-Augmented Agents (I2A)
We introduce Imagination-Augmented Agents (I2As), a novel architecture for deep reinforcement learning combining model-free and model-based aspects. In contrast to most existing model-based reinforcement learning and planning methods, which prescribe how a model should be used to arrive at a policy, I2As learn to interpret predictions from a learned environment model to construct implicit plans in arbitrary ways, by using the predictions as additional context in deep policy networks. I2As show improved data efficiency, performance, and robustness to model misspecification compared to several baselines. …
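The following is a minimal forward-pass sketch of the I2A idea rather than the paper's architecture: imagined rollouts from a learned environment model are encoded and concatenated with model-free features as additional context for the policy. All shapes, weights, and the simple linear "networks" are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
OBS, ACT, HID, ROLLOUT_LEN, N_ROLLOUTS = 8, 4, 16, 3, 2

W_model = rng.normal(size=(OBS + ACT, OBS))      # learned env model (linear stand-in)
W_encode = rng.normal(size=(OBS, HID))           # rollout encoder
W_free = rng.normal(size=(OBS, HID))             # model-free path
W_policy = rng.normal(size=(HID * (1 + N_ROLLOUTS), ACT))

def env_model(obs, action_onehot):
    """Imagined next observation predicted by the learned model."""
    return np.tanh(np.concatenate([obs, action_onehot]) @ W_model)

def imagine_rollout(obs, first_action):
    """Roll the model forward and summarize the imagined trajectory."""
    summary = np.zeros(HID)
    a = np.eye(ACT)[first_action]
    for _ in range(ROLLOUT_LEN):
        obs = env_model(obs, a)
        summary += np.tanh(obs @ W_encode)        # accumulate encoded predictions
        a = np.eye(ACT)[rng.integers(ACT)]        # placeholder rollout policy
    return summary

def policy(obs):
    free = np.tanh(obs @ W_free)                  # model-free features
    imagined = [imagine_rollout(obs, a) for a in range(N_ROLLOUTS)]
    context = np.concatenate([free, *imagined])   # imagination as extra context
    return np.argmax(context @ W_policy)

print(policy(rng.normal(size=OBS)))
```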
Bias
Statistical bias is a feature of a statistical technique or of its results whereby the expected value of the results differs from the true underlying quantitative parameter being estimated. …
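As a small worked illustration of this definition (my example, not from the text above): the "divide by n" sample variance systematically underestimates the true variance, because its expected value is (n-1)/n times the true parameter, whereas the "divide by n-1" estimator is unbiased.

```python
import numpy as np

rng = np.random.default_rng(0)
true_var = 4.0                           # variance of N(0, 2^2)
n, trials = 5, 200_000

samples = rng.normal(0.0, 2.0, size=(trials, n))
biased = samples.var(axis=1, ddof=0)     # divides by n
unbiased = samples.var(axis=1, ddof=1)   # divides by n-1

print(f"true variance         : {true_var}")
print(f"E[biased estimator]   ~ {biased.mean():.3f}")    # ~ 3.2  (= (n-1)/n * 4)
print(f"E[unbiased estimator] ~ {unbiased.mean():.3f}")  # ~ 4.0
```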
Network Deconvolution
Convolution is a central operation in Convolutional Neural Networks (CNNs), which applies a kernel or mask to overlapping regions shifted across the image. In this work we show that the underlying kernels are trained with highly correlated data, which leads to co-adaptation of model weights. To address this issue we propose what we call network deconvolution, a procedure that aims to remove pixel-wise and channel-wise correlations before the data is fed into each layer. We show that removing this correlation yields better convergence rates during model training and superior results, without the use of batch normalization, on the CIFAR-10, CIFAR-100, MNIST, and Fashion-MNIST datasets, as well as against reference 'model zoo' models on the standard ImageNet benchmark. …
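Below is a rough sketch of the decorrelation idea (not the authors' code): im2col-style patches extracted from an input are whitened with an approximate inverse square root of their covariance before a layer's weights see them. The single-channel input, kernel size, and regularization eps are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(16, 16))           # one-channel "image"
k = 3                                    # kernel size

# Extract overlapping k x k patches (rows of the patch matrix).
patches = np.array([
    x[i:i + k, j:j + k].ravel()
    for i in range(x.shape[0] - k + 1)
    for j in range(x.shape[1] - k + 1)
])                                       # shape: (num_patches, k*k)

# Pixel-wise correlations across patch positions -> covariance of patch features.
centered = patches - patches.mean(axis=0)
cov = centered.T @ centered / centered.shape[0]

# Inverse square root of the covariance (eigendecomposition, small eps for stability).
eigval, eigvec = np.linalg.eigh(cov)
inv_sqrt = eigvec @ np.diag(1.0 / np.sqrt(eigval + 1e-5)) @ eigvec.T

whitened = centered @ inv_sqrt          # decorrelated patches fed to the layer
print(np.round(whitened.T @ whitened / whitened.shape[0], 2))  # ~ identity matrix
```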