The Art of Data Science: Integrating Analytics, Design Thinking and Behavioural Insights
Data analytics is a scientific approach that helps organisations solve problems, make better decisions, and increase productivity. Despite its business origins, analytics has been applied across sectors, spawning a market worth over US$130 billion in 2016. However, a significant number of analytics projects fail due, in part, to poor science (techniques), poor art (e.g., implementation, change management), or both. This session covers the critical success factors for organisations embarking on their analytics journeys, with special emphasis on integrating analytics, design thinking and behavioural insights to achieve the best solutions.
The data science and AI renaissance is flourishing because of digitization, the data explosion, and the transformative impact of machine learning, namely its ability to learn new tasks from data. But while existing AI techniques have given us greater insight, we still do not have self-driving cars. This is because building AI systems involves more than learning how to perform a specific task from data; it requires an infrastructure. In this talk, Rob will take you through the steps you need to accelerate the journey to AI, including demos on how best to collect, organize, secure and analyze your data.
How Big Data Science is Changing the Face of Retail
Data and data science are essential to the development of innovative retailers like JD.com. As China’s largest retailer online or offline, JD leverages the latest research in data science to support its Boundaryless Retail vision, as well as to empower its partners and suppliers as part of its Retail as a Service (RaaS) strategy. Dr. Pei will elaborate on several of the company’s data and data science-related challenges and achievements such as big data-enabled smart supply chains and big data R&D platforms. He will also illustrate how an understanding of big data and data science challenges can enable more businesses to evolve and inspire game-changing opportunities and ideas.
Networking Coffee Break
Smarter Selling - A Data-driven Approach to Prioritising Prospective Restaurants
An insight into how Uber uses analytics to identify the gaps in restaurant coverage for the Uber EATS platform, and combines this information with internal and external data to identify and prioritise sales targets.
Deep Learning: The Next Machine Learning Frontier
Deep Learning has revolutionized our approach to machine learning over the last few years. Join us to learn more about what Deep Learning is, how Google uses it in our products, and how you can get started with Deep Learning using Google Cloud.
How to Make a Classifier without a Lot of Data
A recurring engineering problem is detecting emerging issues in a product and fixing them; finding emerging issues early can also speed up the product development cycle. For emerging issues, there are very few training examples for supervised learning, and models are prone to over-fitting. In this talk, we will review some ways to solve this problem, including: simple models with a small number of parameters, reducing the feature set, tuning word vectors, and using transfer learning to leverage the power of big data to solve small-data problems.
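As a toy illustration of the first approach mentioned above, a nearest-centroid classifier has only one centroid per class, so it has little capacity to over-fit a handful of examples. This is a hypothetical sketch, not the speaker's actual method; the feature vectors stand in for pretrained word vectors that transfer learning would supply.

```python
# Minimal sketch: a nearest-centroid classifier, an example of the
# "simple models with a small number of parameters" suited to the
# few-example regime. Vectors here stand in for pretrained embeddings.
import math

def centroid(vectors):
    """Element-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def distance(a, b):
    """Euclidean distance between two vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def fit(labelled):
    """labelled: dict of label -> list of feature vectors (a handful each).
    The whole model is just one centroid per class."""
    return {label: centroid(vecs) for label, vecs in labelled.items()}

def predict(centroids, vector):
    """Assign the label whose centroid is closest to the input vector."""
    return min(centroids, key=lambda lbl: distance(centroids[lbl], vector))

# Only two (invented) training examples per class.
training = {
    "battery": [[0.9, 0.1], [0.8, 0.2]],
    "display": [[0.1, 0.9], [0.2, 0.8]],
}
model = fit(training)
print(predict(model, [0.85, 0.15]))  # closest to the "battery" centroid
```

With so few parameters, the decision boundary stays simple even when each class has only two or three labelled examples.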
Networking Buffet Lunch
Data Democracy: The Distributed Sandbox Model
- Data Regimes: striking a balance between absolute control and total chaos
- Designing a data architecture that enables Data Democracy
- Personal workstations powering distributed sandboxes
- Empowering users toward a data-driven culture
Predictive Analytics in Today's Era of Digital, Machine Learning, and AI: A Financial Industry Perspective
Society and banking are becoming more digitized, and we can now collect and analyze many digital footprints across internal and external data sources, on platforms that compute huge amounts of data at relatively low cost. How are we leveraging this 'perfect storm' to significantly improve the way we engage both external customers (clients) and internal customers (employees) with automated, timely, more accurate and actionable insights at an individual level? Drawing on a few case studies, this presentation will give an overview of how we are rapidly transforming our predictive analytics capabilities and delivery platforms to personalize the experience for each customer.
Generate Actionable Insights from Text Mining
Sephora + Science: A Data-Driven Approach to Understanding Your Customers
An overview of how Sephora uses big data and analytics to create a structured 360-view of customers, paving the way for data-driven marketing strategies.
Networking Coffee Break
Enabling Modern Data Architecture
Enterprises need to build a Modern Data Architecture so they can manage batch, interactive and real-time workloads simultaneously over a central data set. What enterprises really want is to bring data under management as close to its point of origin as possible, whether that origin is a medical device or a piece of moving equipment; to track and process that data through its movement cycle; and to get insights, in real time, into what is happening as the data moves or as their customers engage with it. Ultimately, this gives them the ability to process that data or take action before an event or transaction ever happens.
Workshop: Big Data Connections at Tinder - Modern, Open, In the Cloud
Tinder has created its modern big data and analytics solution on top of Amazon AWS services and open source systems. This solution is capable of processing more than 20 billion events, at the scale of tens of terabytes, on a daily basis. The system has been running smoothly in production for over one year with peak traffic of over 400K events per second, and it lays down a solid foundation for fast product innovation by business analysts, product managers, data scientists and machine learning engineers at Tinder. This talk will focus on what we have done to deliver this solution, as well as several lessons learned along the way. We present both the architectural and operational aspects to illustrate how AWS services and open source systems are integrated to achieve performance, scalability, fault tolerance and more. The AWS services include EC2, EMR, S3, Kinesis Streams, Kinesis Firehose and Redshift, among others. Open source systems such as Finagle, Kubernetes, Spark, Flink, Kafka and Airflow are an integral part of this solution, leveraging the passion and support of large developer communities.
Creating a Roadmap Towards Industry 4.0
Digitalization and Industry 4.0 have gained significant importance over the past few years. Although most companies have identified them as strategic priorities, there are differing interpretations and a lack of clarity on how to approach the topic. As an early adopter of Industry 4.0, Henkel's Adhesive Technologies has implemented 10 smart factories in Asia-Pacific. With a clear goal and a commitment to renewing workplace skills and roles, Industry 4.0 can enable a production plant to go beyond delivering a product to creating value for customers.
Building a Safer City with Big Data and AI Solutions
The world is not getting safer. With the development of the Internet, ICT technologies and social networks, we see new forms of threats emerging alongside new data sources. Terrorism remains one of the major issues of the 21st century. National security big data solutions now create new ways of identifying these threats. An integrated Big Data and Artificial Intelligence public safety platform can help to prevent, detect and fight crime in new ways, helping to build a safer city.
Revealing 360 View of Customer Digital Journey in Astro using Big Data Analytics
Astro has a huge number of products spanning multiple digital platforms such as DTH, OTT, VOD, Audio, Radio, E-commerce and Wallet. Millions of data points are generated daily from each product, and the biggest challenge lies in bringing the customer's entire digital footprint under one roof. The first step towards collating data from such heterogeneous sources is to build a vast data lake, itself heterogeneous in nature, combining the best of all worlds: open source, AWS and Azure. This centralized storage, together with the power of analytics, helps increase Customer Lifetime Value by analysing traits such as viewing patterns, demography, geography, transactions, interactions and behaviour. The outcome is a single view with personalized as well as proactive communication between customer and technology. The session aims to showcase how big data with analytics plays a pivotal role in achieving this.
Onsite Registration & Light Breakfast
Data Driven Business Model and Insurance As A Service - An AXA Case Study
Discover how AXA is transforming its traditional insurance product into Insurance-as-a-Service, using big data to create usage-based, on-demand insurance and integrating with an ecosystem of partners to seamlessly offer the right coverage at the right time in the customer journey.
Real-Time Analysis At-Scale Using Big Data Fabric – A Case Study
Are you adopting big data analytics? Companies are pursuing rapid, self-service BI for business users on very large volumes of data. However, such initiatives often yield little value because these big data systems have become siloed from the rest of the enterprise systems, which hold critical business operational data. Big Data Fabric is a modern data architecture that combines data virtualization, data prep, and lineage capabilities to seamlessly integrate these huge, siloed volumes of structured and unstructured data with other enterprise data assets at scale. This presentation will demonstrate:
· The value, proven through customer case studies, of using big data fabric as a logical data lake for analytics in big data and IoT initiatives
· The architectural stack of the big data fabric, the functions of each component, and the value delivered by each
· Performance benchmarks across big data fabric technologies and at-scale optimization techniques for the lowest possible latency
Implementing an Effective Big Data Strategy for Traditional and New Businesses
Businesses that have honed their craft in big data technologies have become industry game changers. Amazon, eBay, Lazada have transformed the retail industry through personalized product recommendation and customized consumer content because of their data-focused foundation. This presentation explores the relevant tools and techniques for implementing an effective data strategy across businesses of different maturity levels.
Networking Coffee Break
Create One Consistent Golden Source From Multiple Different Systems
Data integration: How do you bring everything together?
· Classifying data for a structured golden source
· Eliminating inaccuracies
· Defining and maintaining the golden source with multiple systems
· Migrating systems from multiple areas to the golden source
· Developing common data identifiers
· Aligning your golden source with technology support
· Achieving consistency when you have a mixture of different data styles
· Tools and solutions for golden source creation
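The bullets above can be made concrete with a small sketch: records for the same customer arrive from multiple systems, a common identifier links them, and a simple survivorship rule (the most recently updated non-empty value wins) resolves conflicts into one golden record. The system names and fields below are invented for illustration.

```python
# Hypothetical sketch of golden-source creation: merge records keyed
# by a common identifier, letting later non-empty values survive.
from datetime import date

def build_golden_source(records):
    """records: list of dicts, each with a shared 'customer_id',
    an 'updated' date, and arbitrary attribute fields."""
    golden = {}
    # Process oldest first so newer values overwrite older ones.
    for rec in sorted(records, key=lambda r: r["updated"]):
        entry = golden.setdefault(rec["customer_id"], {})
        for field, value in rec.items():
            if field in ("customer_id", "updated"):
                continue
            if value:  # survivorship rule: newest non-empty value wins
                entry[field] = value
    return golden

# Two source systems disagree on the name; billing is newer and also
# supplies an email that the CRM record lacked.
crm = {"customer_id": "C001", "updated": date(2018, 3, 1),
       "name": "Tan Wei Ming", "email": ""}
billing = {"customer_id": "C001", "updated": date(2018, 6, 1),
           "name": "W. M. Tan", "email": "wm.tan@example.com"}
golden = build_golden_source([crm, billing])
print(golden["C001"])  # one consistent record from two systems
```

Real golden-source tooling adds much more (matching without a shared key, per-field trust scores, audit lineage), but the merge-and-survive core is the same shape.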
How Analytical Data Virtualization Will Change the Analytics Industry Over the Next 3 Years
Harnessing the Power of Big Data and Analytics in the Supply Chain and Logistics Industry
DHL is striving to make its parcel delivery business as efficient and effective as possible. One thing they are working on is differentiating between deliveries to business addresses and deliveries to residential ones. In this talk, Timothy will share how they achieve this goal.
Networking Buffet Lunch
Unicorn Case Study: GO-JEK - Impacting a Nation
Drawing on her experience at GO-JEK, Crystal explains how the impossible can be made possible with technology and smart data insights. She will speak about how they have used big data in small ways, and in bigger ways, to help GO-JEK and Indonesia move forward together. GO-JEK has had a considerable impact on the nation by building innovative products and driving a new digital era in transport, logistics, lifestyle, and payments.
Data Science Challenges and Impact at Lazada
“Data is the New Oil”—how many times have you heard that? For e-commerce companies such as Lazada, data is indeed as plentiful as oil. Nonetheless, just like (crude) oil, there are many considerations and steps involved before creating value from data. In this talk, Lazada shares some of the considerations and challenges faced in developing its data products to create value for our customers and sellers and to improve the buying and selling experience. We will also share some of the data products, their development process, and their outcomes.
Towards the Data Garden
Powered by leaps in big data infrastructure and ML algorithms, news of AI systems with superhuman performance excites people and stretches our imagination. The possibilities in a Data Garden, where innovation abounds and ecosystems thrive, seem boundless, with huge investment pouring in from all sectors. As a forward-looking organisation, MAS has opened up discussions with the financial industry on the ethical considerations of data, and leads the application of visualisation, ML and AI in transforming our work and galvanising the economy. A completeness of vision requires an ability to execute: will it be a vision, a hallucination or a nightmare? It’s our call.
Unicorn Case Study: The Data Science Behind Grabbing You A Ride
Ride-hailing platforms have become increasingly popular in recent years, providing passengers with a convenient avenue to book a ride. Grab, Southeast Asia’s largest ride-hailing provider, utilises a myriad of data mining techniques to optimise system efficiency through smart allocation of rides, so that passengers can get a ride when they need one. This talk presents some of the data-driven approaches, including predictive modelling, deep learning, optimization and simulation, used in the design, implementation and continuous improvement of the transport services on our platform.
Networking Coffee Break
Enterprise In-House Open Source AI
The numbers on your AI market size report are probably wrong. You are getting information about data science, machine learning and AI from parties with a vested interest, and misinformation may be passed off as good marketing. You have spent your money and political capital on software and consultants, but if you take an honest look at your P&L, you have bought PR instead of real ROI. Meanwhile, the gap between technology companies and traditional MNCs is opening ever wider. In this presentation, we discuss useful handles for distinguishing AI from other analytics capabilities, common dysfunctions that cause enterprises to stumble in developing in-house data science capabilities, and major issues around the emerging AI space, along with the opportunity for organizations to step up and take responsibility as competent and ethical practitioners of AI.
How Big Data Drives Decision Making in Chope
Chope, a platform for restaurant discovery, booking and deals, operates in five countries today and lists more than 3,000 restaurants on its network. With both diner- and restaurant-facing product lines, Chope is able to best match hungry diners' demands to available restaurant tables. This talk focuses on how Chope marries the millions of events generated across its product lines to help diners discover restaurants and receive instant confirmation of their reservations without any restaurant intervention.
Category Suggestions based on Deep Learning
Ever wondered how e-commerce platforms like Shopee categorise millions and millions of listings? Yaozhang, the Data Science Lead at Shopee, will take a deep dive into the mechanics of machine learning for category suggestions based on deep learning. Find out what happens when a seller uploads his or her listings; discover the Data Science Ninja's tools, models and skillsets from our data science weaponry; and explore data science opportunities with the Southeast Asian e-commerce giant.
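To make the task concrete: category suggestion maps a listing title to a category label. Shopee's production system uses deep learning, which is beyond a short sketch; the toy word-count classifier below (a Naive-Bayes-style stand-in, with invented titles and categories) only illustrates the input/output shape of the problem.

```python
# Toy illustration of the category-suggestion task: listing title in,
# category label out. A word-count Naive Bayes stands in for the deep
# network used in production; all training data here is invented.
import math
from collections import Counter, defaultdict

class TitleClassifier:
    def fit(self, titles, labels):
        self.word_counts = defaultdict(Counter)  # per-label word counts
        self.label_counts = Counter(labels)
        self.vocab = set()
        for title, label in zip(titles, labels):
            words = title.lower().split()
            self.word_counts[label].update(words)
            self.vocab.update(words)
        return self

    def predict(self, title):
        words = title.lower().split()
        total_docs = sum(self.label_counts.values())
        best, best_score = None, -math.inf
        for label, count in self.label_counts.items():
            score = math.log(count / total_docs)  # log prior
            total = sum(self.word_counts[label].values())
            for w in words:  # log likelihood with add-one smoothing
                score += math.log(
                    (self.word_counts[label][w] + 1)
                    / (total + len(self.vocab)))
            if score > best_score:
                best, best_score = label, score
        return best

clf = TitleClassifier().fit(
    ["iphone case silicone", "usb charging cable",
     "floral summer dress", "denim skirt"],
    ["electronics", "electronics", "fashion", "fashion"])
print(clf.predict("leather phone case"))
```

A deep model replaces the hand-built word counts with learned representations, but the suggestion API a seller sees, title in and ranked categories out, is the same.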
Workshop: High-Performance Spark
Spark is a powerful distributed computing framework that can perform up to 100x faster than MapReduce thanks to its ability to hold results in RAM rather than writing them to disk. In this workshop, we will set up a Spark cluster, discuss the choice between data joins in Core Spark and Spark SQL, learn how to optimize data join performance, and write high-performance Spark code. At the end of the workshop, we will use Spark Streaming to analyze tweets.
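One join optimization the workshop's topic touches on is the broadcast (map-side) join: when one table is small, Spark can ship it whole to every worker as a hash map so the large table is joined locally without a shuffle. Since PySpark needs a cluster to run, the plain-Python sketch below illustrates only the underlying hash-join idea; the data is invented.

```python
# Plain-Python sketch of the idea behind Spark's broadcast join:
# build a hash map from the small side, then stream the large side
# through it locally, with no shuffle of the large table.
# (In PySpark this is roughly df_large.join(broadcast(df_small), "key").)
def broadcast_join(large, small):
    """large: iterable of (key, value) pairs, arbitrarily big;
    small: list of (key, value) pairs that fits in memory."""
    lookup = {}
    for key, value in small:          # the "broadcast" hash map
        lookup.setdefault(key, []).append(value)
    for key, lvalue in large:         # streamed, never shuffled
        for svalue in lookup.get(key, []):
            yield key, lvalue, svalue

orders = [("u1", "order-9"), ("u2", "order-7"), ("u3", "order-4")]
users = [("u1", "Alice"), ("u2", "Bob")]
print(list(broadcast_join(orders, users)))
```

The pay-off in Spark is avoiding the network shuffle of the large side, which is usually the dominant cost of a distributed join.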