“Foundational models aren’t everything in AI,” Intel executive says

At Intel’s ConnectiON 2023 event, held in Bengaluru, Alexis Crowell, Intel’s VP and CTO for the APAC region, shared with The Hindu the chipmaker’s plans for the region.

Updated - December 15, 2023 05:37 pm IST

Published - December 15, 2023 03:53 pm IST

FILE PHOTO: Intel relies heavily on Indian engineers for developing AI-related hardware and software/ Special Arrangement

Intel has been making chipsets in the Asia Pacific region for over five decades. Recently, India has gained prominence due to its demographic advantage. Apart from local consumption, Intel relies heavily on Indian engineers for developing AI-related hardware and software. (Besides Israel, India is a central hub for building Intel’s Gaudi deep learning training processors meant for AI applications.)

At Intel’s ConnectiON 2023 event, held in Bengaluru, Alexis Crowell, Intel’s VP and CTO for the APAC region, shared with The Hindu the chipmaker’s plans for the region.

Edited excerpts:

Q: There’s no update yet on Intel building a new fab in India, but what is India’s role in the supply chain?

Alexis Crowell: We think India is incredibly important. Global supply chain resiliency is one of the most important things that we are looking at across the entire globe, and India is a pillar of that. We’re investing upwards of $25 billion a year over the next five years just to try and ensure that resiliency can exist. And it’s not just factory capacity, which is super important; it’s also doing other things like ecosystem symposiums that bring component vendors and industry players together to make sure there’s harmony.

One of the things that we’ve done for decades is to try and rally the whole ecosystem together to really make that advance, and India plays a really big role in that, especially with things like the Make in India initiative and some of the local ecosystem players here that are fantastic in growing their business. Now’s the time for them to really be able to harness that and grow exponentially.

Q: Will AI push custom-made silicon even more? Or is that just a trend?

AC: I think that remains to be seen. People are producing custom silicon because they’re not getting the efficiencies that they need from other options and that drives innovation. Which is good, right? I think the question then becomes how programmable is that custom silicon and how much do you get the software ecosystem behind you.

Q: What is Intel offering in an AI-enabled PC?

AC: One, it’s the first time there have been three engines in one package: a dedicated NPU AI processor, a GPU, and a CPU. And what that means is that form factors from desktop to gaming can start to process and separate the compute and the orchestration of software in a more efficient and effective way.

We’ve actually worked with AI on PCs for a while with the likes of Adobe and other companies, especially in the creative space, to use image recognition and so on. But the current architecture allows multitasking and some of the other simultaneous capabilities that you get when you have a GPU, an NPU, and a CPU all sitting there concurrently.

I am just super excited to see what developers [will] do. I think developers are some of the most innovative people in the world. And as soon as they get their hands on it, I certainly don’t even know what they’re going to come up with. The easy ones are the tasks that are going to get a heck of a lot easier. And it’s going to be instant, because they’re going to process all of it on my computer itself, instead of sending it to the cloud. It saves companies massive amounts of money, because now you don’t have the bandwidth costs of sending data into the cloud. You don’t have the cloud costs. You pay for your PC and now you’ve got the compute capability.

Q: Hardware is changing rapidly now to accommodate the high compute demands of generative AI. What trends do you see in the near future?

AC: I might change your premise just slightly, because I think what’s really happening is that communities are just figuring out how to harness compute power better. If we look at the large language models, they’re not built on bespoke chips. They’re built on things that have been around for a very long time; it’s just that people have figured out how to do more with what’s available.

This just spins into what we used to call the software spiral. And the idea behind it is, as compute power gets better, the software starts taking more advantage of it and people innovate more and build really cool things, which then begets more compute power because it just continues to spiral. And I think that’s what we’re seeing happen in real time right now. But it’s actually not very different from what’s happened in the past. It’s just really visible.

Q: Several Big Tech firms are starting to make in-house AI chips to solve the compute problem. What are your thoughts on it?

AC: I think competition is good. It fuels innovation, and it keeps everybody putting their best foot forward. We love it on a couple of fronts. One, we’re building chips, and we’re selling them in that capacity. But we also have opened our factories and can build those chips on behalf of those customers. So, there’s actually a couple of ways that we play in that market. People can continue to iterate and they can continue to try and bring forth new features and new functionality.

Q: What are your thoughts on regulating AI?

AC: Regulation in technology requires a delicate balance because you don’t want to over-regulate in a way that stifles innovation. You don’t want to stop people from being able to invent amazing new things. But you also want to put in some guardrails that help ensure the technology being built is good for humans.

I think that’s the balance that all of these governments are trying to strike right now. Private industry players and governments need to work together closely, because if you don’t do it hand-in-hand and the government doesn’t know what’s happening, how can they regulate something and put guardrails in place?

Q: What’s your view on the future of AI?

AC: I have seen the perception around AI go through a drastic change since we started working with it. Even in 2017, we were using it but wanted to avoid mentioning it. Now, not only do we talk about it, but we can talk about it with pride.

Currently, we’re moving a little backwards as an industry, where we throw compute at things first and then try and pick the problem to solve, versus looking for the biggest problems to solve.

Foundational models aren’t everything. There are a lot of good things that can happen with those models. But I don’t want businesses and hospitals and governments to miss out on areas where AI can truly help them make a difference, because we’re so myopically focused on the bright shiny object.
