Title: Memory-Driven Computing

Abstract: Data has been doubling every two years since 2013, and from 2018 to 2020 we will create 20 zettabytes of data, as much as was previously created in the entire history of mankind. At the same time, Moore's law is hitting a wall, and Dennard scaling (the frequency increases and power reductions that came with each transistor size reduction) ended more than ten years ago. Per-core performance is flat, and the number of cores is no longer increasing. This has led to a wave of new computing paradigms, as evidenced by the growing use of FPGAs and GPUs, and to new classes of memory such as the memristor and 3D XPoint. Realizing this new world of mixed-model compute and scalable memory requires a new system architecture, and HPE's vision for it is called Memory-Driven Computing (MDC). The Machine is Hewlett Packard Labs' multifaceted advanced development program for memory-driven architectures based on the MDC vision; it brings together byte-addressable non-volatile memory, photonic interconnects, and specialized SoCs in a vastly scalable "shared something" model. This architecture can adapt to the massively parallel, shared-nothing, scale-out model of a Hadoop/Spark environment as well as to the fully shared, scale-up model of an OLTP workload. As part of this program, we're building hardware, new OS features, new data stores, new analytics platforms, and new programming models. This talk will discuss the technologies that comprise the MDC vision and their implications for systems software and application programs, and describe the work we're doing at HPE to address some of these challenges and opportunities.