Demo | LLM Inference on Intel® Data Center GPU Flex Series | Intel Software

Sheik Mohamed Imran, dGPU/AI Technical Solutions Specialist at Intel, shows how to access and use the BigDL framework, specifically BigDL-LLM, from within the Intel Developer Cloud virtual sandbox. He demonstrates how Intel® Data Center GPU Flex Series accelerates inference and walks through how to run large language models on Intel platforms.

Visit the GitHub page to learn more: https://github.com/intel-analytics/BigDL/tree/main/python/llm
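For reference, the following is a minimal sketch (not taken from the video) of how BigDL-LLM's Hugging Face transformers-style API can be used for low-bit LLM inference on an Intel GPU. It assumes the bigdl-llm[xpu] package and Intel Extension for PyTorch are installed, and the model name is purely illustrative:

# Minimal sketch: 4-bit LLM inference with BigDL-LLM on an Intel GPU ('xpu') device
import torch
import intel_extension_for_pytorch as ipex  # enables the 'xpu' device in PyTorch
from transformers import AutoTokenizer
from bigdl.llm.transformers import AutoModelForCausalLM

model_id = "meta-llama/Llama-2-7b-chat-hf"  # illustrative model name
tokenizer = AutoTokenizer.from_pretrained(model_id)

# load_in_4bit=True applies BigDL-LLM's low-bit (INT4) optimizations at load time
model = AutoModelForCausalLM.from_pretrained(model_id, load_in_4bit=True)
model = model.to("xpu")  # move the model to the Intel GPU

prompt = "What is large language model inference?"
inputs = tokenizer(prompt, return_tensors="pt").to("xpu")
with torch.inference_mode():
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))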

About Intel Software:
Intel® Developer Zone is committed to empowering and assisting software developers in creating applications for Intel hardware and software products. The Intel Software YouTube channel is an excellent resource for those seeking to enhance their knowledge. Our channel provides the latest news, helpful tips, and engaging product demos from Intel and our numerous industry partners. Our videos cover various topics; you can explore them further by following the links.

Connect with Intel Software:
INTEL SOFTWARE WEBSITE: https://intel.ly/2KeP1hD
INTEL SOFTWARE on FACEBOOK: http://bit.ly/2z8MPFF
INTEL SOFTWARE on TWITTER: http://bit.ly/2zahGSn
INTEL SOFTWARE GITHUB: http://bit.ly/2zaih6z
INTEL DEVELOPER ZONE LINKEDIN: http://bit.ly/2z979qs
INTEL DEVELOPER ZONE INSTAGRAM: http://bit.ly/2z9Xsby
INTEL GAME DEV TWITCH: http://bit.ly/2BkNshu

#intelsoftware