Announcing TensorFlow 1.0

In just its first year, TensorFlow has helped researchers, engineers, artists, students, and many others make progress with everything from language translation to early detection of skin cancer and preventing blindness in diabetics. We’re excited to see people using TensorFlow in over 6000 open-source repositories online. Today, as part of the first annual TensorFlow Developer Summit, hosted in Mountain View and livestreamed around the world, we’re announcing TensorFlow 1.0:
• It’s faster
• It’s more flexible
• It’s more production-ready than ever


Performance improvements coming to R 3.4.0

R 3.3.3 (codename: ‘Another Canoe’) is scheduled for release on March 6. This is the ‘wrap-up’ release of the R 3.3 series, which means it will include minor bug fixes and improvements but eschew major new features. Major changes are coming, though, with the subsequent release of R 3.4.0. While the NEWS file announcing updates in 3.4.0 is still subject to change, it indicates several major changes aimed at improving the performance of R in various ways: …
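One headline item in that NEWS file is that R 3.4.0 enables the JIT byte-code compiler by default (at level 3), so ordinary R functions are byte-compiled on first use. As a rough, illustrative sketch, you can preview the effect on R 3.3.x yourself; the exact speedup will vary by machine and workload:

    library(compiler)
    enableJIT(3)  # what 3.4.0 turns on by default

    # A deliberately interpreter-heavy loop to make the effect visible
    f <- function(n) {
      s <- 0
      for (i in 1:n) s <- s + i
      s
    }
    system.time(f(1e7))  # compare with enableJIT(0) to see the difference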


Ehlers’s Autocorrelation Periodogram

This post will introduce John Ehlers’s Autocorrelation Periodogram, a mechanism designed to dynamically find a lookback period. That matters because the lookback period is the parameter most commonly optimized in backtests.
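To make the idea concrete, here is a deliberately stripped-down R sketch. This is not Ehlers’s actual algorithm (which runs a discrete Fourier transform over the autocorrelations); it simply scores candidate lags in the 10–48 band common in Ehlers’s work by autocorrelation strength and returns the winner as a dynamic lookback:

    # Toy stand-in for the idea: pick the lookback as the lag with the
    # strongest autocorrelation inside a candidate band.
    dominant_lag <- function(x, min_lag = 10, max_lag = 48) {
      ac <- acf(x, lag.max = max_lag, plot = FALSE)$acf[-1]  # drop lag 0
      lags <- seq_len(max_lag)
      band <- lags >= min_lag
      lags[band][which.max(ac[band])]
    }

    set.seed(42)
    x <- sin(2 * pi * (1:500) / 20) + rnorm(500, sd = 0.3)  # 20-bar cycle
    dominant_lag(x)  # recovers a lag near 20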


Multilabel classification with neuralnet package

Some time ago I wrote an article on how to use a simple neural network in R with the neuralnet package to tackle a regression task. Since then, however, I turned my attention to other libraries such as MXNet, mainly because I wanted something more than what the neuralnet package provides (for starters, convolutional neural networks and, why not, recurrent neural networks). A few weeks ago, however, I was asked how to use the neuralnet package for making a multilabel classifier. I wrote a quick script as an example and thought I could write a short article on it; furthermore, a classification tutorial using the neuralnet package would complement the one I did on regression.

The neuralnet package is perhaps not the best option in R for using neural networks. If you ask why: for starters, it does not recognize the typical formula y~., it does not support factors, and it does not provide many models beyond a standard MLP. It also has strong competitors in the nnet package, which seems to be better integrated in R and can be used with the caret package, and in MXNet, a high-level deep learning library that provides a wide variety of neural networks. But still, I think there is some value in the ease of use of the neuralnet package, especially for a beginner, therefore I’ll be using it.
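For a flavour of what such a script looks like, here is a minimal, self-contained sketch on toy data (the data and column names are made up for illustration; this is not the script from the post). The two tricks are spelling the formula out in full, since y~. is not recognized, and setting linear.output = FALSE so the outputs pass through the logistic activation:

    library(neuralnet)

    # Toy data: two numeric features, two binary labels
    set.seed(1)
    df <- data.frame(x1 = runif(200), x2 = runif(200))
    df$l1 <- as.numeric(df$x1 + df$x2 > 1)  # label 1
    df$l2 <- as.numeric(df$x1 > df$x2)      # label 2

    # neuralnet needs the formula written out explicitly
    nn <- neuralnet(l1 + l2 ~ x1 + x2, data = df,
                    hidden = 4, linear.output = FALSE)

    # Per-label probabilities; threshold at 0.5 for hard labels
    probs <- compute(nn, df[, c("x1", "x2")])$net.result
    preds <- probs > 0.5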


Calculating required sample size in R and SAS

Today we are going to digress from our ongoing “Intro to R” series, and talk about a subject that’s been on my mind lately: sample sizes. An important question when designing an experiment is “How big a sample do I need?” A larger sample will give more accurate results, but at a cost. Use too small a sample, and you may get inconclusive results; too large a sample, and you’re wasting resources. To calculate the required sample size, you’ll need to know four things (a short R example follows the list):
1. The size of the response you want to detect
2. The variance of the response
3. The desired significance level
4. The desired power
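Those four ingredients map directly onto the arguments of R’s built-in power.t.test(). With illustrative numbers, sizing a two-sample t-test looks like this:

    # Detect a mean difference of 1.5 units with sd = 2,
    # at the 5% significance level with 80% power.
    # Leaving n unspecified tells R to solve for it.
    power.t.test(delta = 1.5, sd = 2, sig.level = 0.05, power = 0.80)

The result reports the required n per group; SAS’s PROC POWER works from the same four inputs.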


Introduction to ggraph: Edges

This is the third post in my series of ggraph introductions. The first post introduced the concept of layouts, which are simply specifications of how nodes should be placed in the plane. The second explained how to draw nodes using the geom_node_*() family. This post will connect the dots, so to speak, by introducing the concept of edges.
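As a minimal taste of what that looks like (the graph here is a made-up toy example, not one from the series):

    library(ggraph)
    library(igraph)

    g <- make_ring(10)  # a small toy graph

    ggraph(g, layout = "circle") +
      geom_edge_link() +          # edges drawn as straight segments
      geom_node_point(size = 3)   # nodes, from the geom_node_*() family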