Mark Wade
Mark is the Chief Executive Officer and Co-Founder of Ayar Labs. His prior roles at Ayar Labs include Chief Technology Officer and Senior Vice President of Engineering. He is recognized as a pioneer in photonics technologies and, before founding the company, led the team that designed the optics in the world's first processor to communicate using light. He and his co-founders invented breakthrough technology at MIT and UC Berkeley from 2010-2015, which led to the formation of Ayar Labs. He holds a PhD from the University of Colorado.
In today’s AI infrastructure, traditional copper and pluggable optics cannot scale package-level compute advances to the rack and row levels of the system, resulting in low efficiency, high power consumption, and high costs. New technologies are needed to support growing model sizes and complexity. Ayar Labs' in-package optical I/O solution enables peak platform performance by providing efficient, low-cost scaling at the rack and row levels. It also offers extended accelerator memory to optimize the balance between memory and compute. In this presentation, Mark Wade will show application-level improvements in performance and TCO metrics, such as productivity, profitability, and interactivity, using optical I/O-based scale-up fabrics for inference and training.
Ayar Labs
Website: https://ayarlabs.com/
Ayar Labs is transforming AI infrastructure by accelerating data movement. Recognizing that the complexity and size of AI models are increasing at a rate that traditional interconnect technology cannot handle, the company has developed the industry’s first optical I/O solution that enables customers to maximize the compute efficiency and performance of growing AI infrastructure, while reducing costs, latency and power consumption. Based on open standards and optimized for both AI training and inference, Ayar Labs’ optical I/O solution is backed by a robust ecosystem that enables it to integrate smoothly into AI systems at scale.
Preet Virk
Celestial AI
Website: https://www.celestial.ai/
With the growth of Generative AI, data center infrastructure is no longer just about the System on Chip but about the System of Chips. In the era of Accelerated Computing, data center bottlenecks are no longer limited to compute performance; they now include the system’s interconnect bandwidth, memory bandwidth, and memory capacity. Celestial AI's Photonic Fabric™ is a next-generation interconnect technology offering a 10X increase in performance and energy efficiency over competing technologies.
The Photonic Fabric™ is available to our customers in multiple technology offerings, including optical chiplets, optical interposers, and Optical Multi-chip Interconnect Bridges (OMIB). This enables our customers to seamlessly integrate high-bandwidth, low-power, low-latency optical interfaces into their AI accelerators and GPUs. The technology is fully compatible with both logical layers (protocols) and physical layers, including standard 2.5D packaging flows. This ease of integration enables XPUs to have optical interconnects for compute-to-compute and compute-to-memory fabrics that deliver tens of Tbps of bandwidth with nanosecond latencies.
This innovation empowers hyperscalers to improve the efficiency and economics of AI processing by optimizing the number of XPUs needed for training and inference and significantly lowering TCO. To support customer engagements, Celestial AI is cultivating a Photonic Fabric ecosystem. These tier-1 partnerships span custom silicon/ASIC design services, system integrators, and HBM memory, assembly, and packaging suppliers.
Zaid Kahn
Zaid is currently a VP in Microsoft’s Silicon, Cloud Hardware, and Infrastructure Engineering organization where he leads systems engineering and hardware development for Azure including AI systems and infrastructure. Zaid is part of the technical leadership team across Microsoft that sets AI hardware strategy for training and inference. Zaid's teams are also responsible for software and hardware engineering efforts developing specialized compute systems, FPGA network products and ASIC hardware accelerators.
Prior to Microsoft, Zaid was head of infrastructure at LinkedIn, where he was responsible for all aspects of architecture and engineering for datacenters, networking, compute, storage, and hardware. Zaid also led several software development teams focused on building and managing infrastructure as code. This included zero-touch provisioning, software-defined networking, network operating systems (SONiC, OpenSwitch), self-healing networks, backbone controllers, software-defined storage, and distributed host-based firewalls. The network teams Zaid led built the global network for LinkedIn, including POPs, peering for edge services, IPv6 implementation, DWDM infrastructure, and datacenter network fabric. The hardware and datacenter engineering teams Zaid led were responsible for water cooling to the racks, optical fiber infrastructure, and open hardware development contributed to the Open Compute Project Foundation (OCP).
Zaid holds several patents in networking and is a sought-after keynote speaker at top tier conferences and events. Zaid is currently the chairperson for the OCP Foundation Board. He is also currently on the EECS External Advisory Board (EAB) at UC Berkeley and a board member of Internet Ecosystem Innovation Committee (IEIC), a global internet think tank promoting internet diversity. Zaid has a Bachelor of Science in Computer Science and Physics from the University of the South Pacific.
In this enlightening panel, we explore the technical innovations that create trusted execution environments and enable data partnerships across functions, departments, and organizations, fostering a trusted and resilient data economy. Key highlights include:
Collaboration Between PETs and Confidential Computing:
Understand the need for synergy between these technologies to support enterprise-level adoption. Explore how PETs and confidential computing enhance data integrity and security.
Unlocking Sensitive Data Sets:
Whether dealing with personally identifiable information, health records, or financial reports, data integrity is invaluable. Discover a privacy toolkit that securely unlocks these data sets, empowering organizations to navigate privacy complexities.
Enhancing Trust and Resilience in Data Economy:
Embrace these innovations to create a stronger, transparent data ecosystem that thrives on trust and resilience.
Attendees will leave with an understanding of how PETs and confidential computing work hand in hand, and how these technologies increase data utility and data integrity, ultimately contributing to a stronger, more transparent data ecosystem built on trust and resilience.
Jay Prakash
Silence Laboratories
Website: https://www.silencelaboratories.com/
Silence Laboratories, a privacy technology and infrastructure company based in Singapore, provides algorithms and product suites using distributed computation and authorization technologies, particularly multi-party computation (MPC).
Silence Laboratories (SL) offers some of the fastest threshold signature and authorization libraries in production, safeguarding digital assets worth billions. SL’s libraries and infrastructure enable enterprises to collaborate on data without transferring it to a trusted party, thereby opening avenues for monetization and the development of innovative products. With partners across various sectors, including finance, digital assets, and data industries, SL is creating a data-sharing and computation ecosystem where consent and privacy-preserving analysis are mathematically coupled, ensuring compliance and security. SL is preferred by enterprises owing to its developer-friendly libraries and application-agnostic privacy-preserving collaboration tools.
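Silence Laboratories' production protocols (such as threshold signatures) are far more sophisticated, but the core idea behind MPC-based collaboration — computing a joint result on data that no single party ever sees in the clear — can be illustrated with a minimal additive secret-sharing sketch. The party names and values below are hypothetical and purely for illustration:

```python
import secrets

P = 2**61 - 1  # prime modulus defining the field the shares live in

def share(value: int, n_parties: int) -> list[int]:
    """Split `value` into n random additive shares that sum to it mod P.
    Any subset of fewer than n shares reveals nothing about `value`."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recombine all shares to recover the secret."""
    return sum(shares) % P

# Hypothetical example: two organizations jointly compute a total
# without either one revealing its own input to the other.
a_shares = share(1200, 3)
b_shares = share(850, 3)
# Each of the three share-holders adds its two shares locally;
# only the final sum is ever reconstructed, never the inputs.
sum_shares = [(x + y) % P for x, y in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 2050
```

Because addition of shares commutes with reconstruction, the parties learn the aggregate (2050) while each individual input stays hidden — the same principle, scaled up with far stronger protocols, underlies privacy-preserving data collaboration.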