
Equinix Engage Kuala Lumpur: Putting Some Substance Behind the Recent Hype Around Data Centres

Data Storage Asia had the opportunity to attend Equinix Engage at Le Meridien in Kuala Lumpur on Thursday this week.

If you read the local business news, you’ve likely seen many big tech companies making numerous claims about their investment in data centre real estate in Malaysia. The news has been hyped up with talk about how much money these companies are investing in the region.

Amidst all this buzz, Equinix, one of the world’s leading data centre companies, has taken a grounded approach, establishing two data centres in Malaysia and continuing to expand their investment in Cyberjaya. Their focus has always been on partnering with international and local companies to provide the infrastructure and services they need to accelerate their technical and digital aspirations.

This commitment to infrastructure was evident at Equinix Engage, where the focus was less about hype and making headlines, and more about the business of laying the foundation for Malaysian organisations to excel in the brave new digital world. This event focused on really getting to grips with generative AI and understanding how to build the infrastructure that will enable AI projects to flourish. This was not about jumping on the “we are an AI company” bandwagon. Rather, Equinix proudly reminded us that they are an infrastructure company and sit in the unique position of helping organisations build the infrastructure to support their AI dreams (or hallucinations).

Equinix Malaysia Managing Director Tat Inn Cheam kicked off Equinix Engage, giving some background on Equinix globally and in Malaysia. For those who don’t know, Equinix is the world’s digital infrastructure company, powering 472,000 interconnections, providing customers access to 2,000+ network services, 3,000+ cloud and IT services, as well as 450+ content and digital media services.

IDC’s Simon Piff, who has a long history with Cheam going back to their time working together at DEC 28 years ago, gave a keynote presentation where he offered some ‘tough love’ and ‘sage advice.’ The tough love was that companies have no choice but to get moving on their AI adoption journeys. More than that, he felt that in this region, IT departments are falling behind when it comes to automating IT operations—something AI is increasingly going to handle for us. He asked the audience, ‘Who feels that their IT department has enough employees?’, and no one raised a hand. His point was that, as the demands on IT grow more complex, automating operational tasks is critical.

The sage advice concerned building generative AI (Gen AI) models. While Gen AI is associated with massive datasets and daunting amounts of compute power, he assured the audience that when building models for your own company, the scale is unlikely to be anywhere near as large and will be achievable on ‘standard’ cloud implementations.

The main session of Equinix Engage concluded with a discussion about how to build the infrastructure to accelerate Gen AI, moderated by our very own Group Publisher, Andrew Martin. He was joined on the panel by Simon Piff from IDC; Simon Lockington, Senior Director of Global Solution Architecture – APAC at Equinix; Hemanth Kalikiri, Managing Director of Cloud and Infrastructure Services – Asia at Capgemini; and Yong Yoon Kit, Vice Chairman of the FMM Industry 4.0 Committee.

AOPG Group Publisher Andrew Martin aimed to pose the questions that many IT directors would have liked to put to the panel of experts. He took time to drill into their knowledge of how companies should take their first step into Gen AI projects.

The Equinix Engage discussion was wide-ranging, and refreshingly it was not about selling Equinix’s capabilities. Instead, the panel metaphorically “rolled up their sleeves” and focused on giving the audience advice and guidance about how and why to implement generative AI projects.

Yoon Kit explained that in the industry he represents, the reality of jumping into AI is not so straightforward. Many manufacturers are preoccupied with keeping the wheels turning in their day-to-day business, and it’s difficult to allocate resources or budgets to work on new AI projects. That said, he identified that in manufacturing, one use case for AI is already proving itself: predictive maintenance, which can save companies significant downtime—and in manufacturing, that equates to money.

Picking up on this, Simon Piff from IDC reminded us that our dialogue was not just about generative AI; the discussion also needed to include operational AI and predictive AI. Across the panel, it was generally agreed that defining aims for AI and getting wide buy-in for AI projects from business leaders outside of IT is important, and that setting expectations for what the results will look like is even more important.

Given that research shows as few as 30% of Gen AI projects make it from proof of concept to live implementation, it was interesting to get advice from the panel on best practices for embarking on a ‘first’ AI project. Simon Lockington from Equinix encapsulated it really nicely when he explained that it’s better not to be too defined in your expected outcomes. Instead, jump right in and just do it—but start small, see the results, then repeat and expand from there. Hemanth from Capgemini agreed that starting small is vital, as it allows you to understand the kinds of results you can achieve and to discover the unique issues of your own situation at a more manageable pace.

On the issue of needing huge expensive resources (think banks of GPUs) for AI implementations, Simon Lockington reminded us that most company-specific AI projects will use only company data (we don’t all need to train our models on the entire internet like ChatGPT). As a result, the infrastructure investment is often much more modest than people imagine. Simon Piff backed that up, pointing to examples of companies he has engaged with that invested heavily in GPUs but aren’t using them all, and to the other extreme of a company using spare processor capacity on desktop workstations to run AI modelling through the night.

Simon also explained that, despite all the talk about Large Language Models (LLMs), when you boil things down to company-specific use cases, you enter the realm of Small Language Models (SLMs), which, as the name suggests, do not require the same level of infrastructure and compute to run.

Equinix Engage was an eye-opener for many who attended. Some of the attendees we spoke to felt the panel dialogue could have gone on longer, but in terms of Equinix’s aims for the event, we would have to say, “Job done”. They got the IT leaders who attended thinking hard, and post-event, the dialogue will continue, with Equinix no doubt facilitating and helping Malaysian companies to take sensible steps to successfully implement AI that delivers real value.


DSA Editorial

The region’s leading specialist IT news publication focused on Data Lifecycle, Storage Infrastructure and Data-Driven Transformation. DSA has nearly 17,000 e-news subscribers, over 6,500 unique visitors per day, over 20,000 social media followers and a reputation for deep domain knowledge.
