https://github.com/intel/ipex-llm

Posted on 2026-1-28 11:44:13 | Show all posts | Reading mode
Last edited by rafavi on 2026-1-28 11:46

[url=https://github.com/intel/ipex-llm/blob/main/docs/mddocs/Quickstart/install_windows_gpu.md]https://github.com/intel/ipex-llm[/url]
Install ipex-llm
With the llm environment active, use pip to install ipex-llm for GPU:
Choose either US or CN website for extra-index-url:
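The pip commands themselves did not survive the copy-paste; below is a sketch of this step based on the linked quickstart (the exact package extras and index URLs may change, so verify them against the guide before running):

```shell
# US mirror:
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

# CN mirror:
pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
```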
Note
If you encounter network issues while installing IPEX, refer to this guide for troubleshooting advice.

Verify Installation
You can verify that ipex-llm is installed correctly by following the steps below.
Step 1: Runtime Configurations
  • Open the Miniforge Prompt and activate the Python environment llm you previously created:
    conda activate llm

  • Set the following environment variables according to your device:

    • For Intel iGPU and Intel Arc™ A770:
      set SYCL_CACHE_PERSISTENT=1




Tip
For other Intel dGPU Series, please refer to this guide for more details regarding runtime configuration.
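As a quick sanity check that the environment variable from the step above is actually visible to Python, you can inspect `os.environ` (a minimal sketch; here the variable is set from Python only to make the snippet self-contained, whereas the guide sets it in the prompt with `set`):

```python
import os

# Simulates the `set SYCL_CACHE_PERSISTENT=1` step from the prompt
os.environ.setdefault("SYCL_CACHE_PERSISTENT", "1")
print(os.environ["SYCL_CACHE_PERSISTENT"])  # → 1
```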

Step 2: Run Python Code
  • Launch the Python interactive shell by typing python in the Miniforge Prompt window and pressing Enter.
  • Copy the following code into the Miniforge Prompt line by line, pressing Enter after each line.
    import torch
    from ipex_llm.transformers import AutoModel, AutoModelForCausalLM

    tensor_1 = torch.randn(1, 1, 40, 128).to('xpu')
    tensor_2 = torch.randn(1, 1, 128, 40).to('xpu')
    print(torch.matmul(tensor_1, tensor_2).size())

    It should print the following at the end:
    torch.Size([1, 1, 40, 40])

    Tip:
    If you encounter any problem, please refer to here for help.
  • To exit the Python interactive shell, simply press Ctrl+Z then press Enter (or input exit() then press Enter).
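The expected size can be cross-checked without any GPU: it follows from the batched matmul shape rule. A minimal sketch in plain Python (the helper name is ours, not part of ipex-llm):

```python
def matmul_shape(a, b):
    # For batched matmul, the inner dimensions must match;
    # the leading (batch) dimensions here are identical and carry through.
    assert a[-1] == b[-2], "inner dimensions must match"
    return a[:-2] + (a[-2], b[-1])

# (1, 1, 40, 128) @ (1, 1, 128, 40) -> (1, 1, 40, 40)
print(matmul_shape((1, 1, 40, 128), (1, 1, 128, 40)))  # → (1, 1, 40, 40)
```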



Bluetooth-UWB Alliance Forum (京ICP备19003900号-5)
