Scalable Strategies for Computing with Massive Data

This paper presents two complementary statistical computing frameworks that address challenges in parallel processing and the analysis of massive data. First, the foreach package allows users of the R programming environment to define parallel loops that may be run sequentially on a single machine, in parallel on a symmetric multiprocessing (SMP) machine, or in cluster environments without platform-specific code. Second, the bigmemory package implements memory- and file-mapped data structures that provide (a) access to arbitrarily large data while retaining a look and feel that is familiar to R users and (b) data structures that are shared across processor cores in order to support efficient parallel computing techniques. Although these packages may be used independently, this paper shows how they can be used in combination to address challenges that have effectively been beyond the reach of researchers who lack specialized software development skills or expensive hardware.
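As a minimal sketch of the combination described above (not an example from the paper itself), the following R code creates a file-backed big.matrix whose descriptor file lets multiple worker processes attach to the same data without copying, and drives a column-wise computation with a foreach loop. The file names, dimensions, and choice of the doParallel backend are illustrative assumptions.

```r
library(foreach)
library(doParallel)   # one of several parallel backends for foreach
library(bigmemory)

## Create a file-backed matrix; the descriptor file allows other
## processes to attach to the same underlying data. Names and sizes
## here are illustrative, not taken from the paper.
x <- big.matrix(nrow = 1e5, ncol = 10, type = "double",
                backingfile = "x.bin", descriptorfile = "x.desc")
x[,] <- rnorm(1e6)

cl <- makeCluster(2)
registerDoParallel(cl)

## Each worker attaches to the shared matrix and computes one column
## mean; changing %dopar% to %do% runs the same loop sequentially,
## with no other changes to the code.
colmeans <- foreach(j = 1:ncol(x), .combine = c,
                    .packages = "bigmemory") %dopar% {
  y <- attach.big.matrix("x.desc")
  mean(y[, j])
}

stopCluster(cl)
```

Because the workers attach to the file-backed matrix rather than receiving copies of it, the same loop body works whether the backend runs on one SMP machine or across a cluster with access to the backing file.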