01/21 15:15:54 - OpenCompass - INFO - Start inferencing [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_geography]
100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 19/19 [00:00<00:00, 527760.11it/s]
[2024-01-21 15:15:54,813] [opencompass.openicl.icl_inferencer.icl_gen_inferencer] [INFO] Starting inference process...
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 5/5 [00:08<00:00, 1.65s/it]
01/21 15:16:09 - OpenCompass - INFO - Start inferencing [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-ideological_and_moral_cultivation]
100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 19/19 [00:00<00:00, 758969.30it/s]
[2024-01-21 15:16:09,984] [opencompass.openicl.icl_inferencer.icl_gen_inferencer] [INFO] Starting inference process...
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 5/5 [00:06<00:00, 1.25s/it]
01/21 15:16:17 - OpenCompass - INFO - Start inferencing [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_chinese]
100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 19/19 [00:00<00:00, 744782.95it/s]
[2024-01-21 15:16:17,618] [opencompass.openicl.icl_inferencer.icl_gen_inferencer] [INFO] Starting inference process...
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 5/5 [00:24<00:00, 4.87s/it]
01/21 15:16:42 - OpenCompass - INFO - Start inferencing [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-sports_science]
100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 19/19 [00:00<00:00, 705236.96it/s]
[2024-01-21 15:16:42,099] [opencompass.openicl.icl_inferencer.icl_gen_inferencer] [INFO] Starting inference process...
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 5/5 [00:06<00:00, 1.38s/it]
01/21 15:16:49 - OpenCompass - INFO - Start inferencing [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-basic_medicine]
100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 19/19 [00:00<00:00, 711533.71it/s]
[2024-01-21 15:16:49,085] [opencompass.openicl.icl_inferencer.icl_gen_inferencer] [INFO] Starting inference process...
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 5/5 [00:06<00:00, 1.34s/it]
01/21 15:16:55 - OpenCompass - INFO - Start inferencing [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-probability_and_statistics]
100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 18/18 [00:00<00:00, 571950.55it/s]
[2024-01-21 15:16:55,928] [opencompass.openicl.icl_inferencer.icl_gen_inferencer] [INFO] Starting inference process...
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 5/5 [00:23<00:00, 4.61s/it]
01/21 15:17:19 - OpenCompass - INFO - Start inferencing [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_mathematics]
100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 18/18 [00:00<00:00, 692637.36it/s]
[2024-01-21 15:17:19,146] [opencompass.openicl.icl_inferencer.icl_gen_inferencer] [INFO] Starting inference process...
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 5/5 [00:16<00:00, 3.22s/it]
01/21 15:17:35 - OpenCompass - INFO - Start inferencing [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-discrete_mathematics]
100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:00<00:00, 599186.29it/s]
[2024-01-21 15:17:35,369] [opencompass.openicl.icl_inferencer.icl_gen_inferencer] [INFO] Starting inference process...
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4/4 [00:06<00:00, 1.74s/it]
01/21 15:17:42 - OpenCompass - INFO - Start inferencing [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-middle_school_geography]
100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 12/12 [00:00<00:00, 483958.15it/s]
[2024-01-21 15:17:42,407] [opencompass.openicl.icl_inferencer.icl_gen_inferencer] [INFO] Starting inference process...
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 3/3 [00:05<00:00, 1.69s/it]
01/21 15:17:47 - OpenCompass - INFO - time elapsed: 910.91s
01/21 15:17:54 - OpenCompass - DEBUG - Get class `NaivePartitioner` from "partitioner" registry in "opencompass"
01/21 15:17:54 - OpenCompass - DEBUG - An `NaivePartitioner` instance is built from registry, and its implementation can be found in opencompass.partitioners.naive
01/21 15:17:54 - OpenCompass - DEBUG - Key eval.runner.task.judge_cfg not found in config, ignored.
01/21 15:17:54 - OpenCompass - DEBUG - Key eval.runner.task.dump_details not found in config, ignored.
01/21 15:17:54 - OpenCompass - DEBUG - Additional config: {'eval': {'runner': {'task': {}}}}
01/21 15:17:54 - OpenCompass - INFO - Partitioned into 52 tasks.
01/21 15:17:54 - OpenCompass - DEBUG - Task 0: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-computer_network]
01/21 15:17:54 - OpenCompass - DEBUG - Task 1: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-operating_system]
01/21 15:17:54 - OpenCompass - DEBUG - Task 2: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-computer_architecture]
01/21 15:17:54 - OpenCompass - DEBUG - Task 3: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-college_programming]
01/21 15:17:54 - OpenCompass - DEBUG - Task 4: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-college_physics]
01/21 15:17:54 - OpenCompass - DEBUG - Task 5: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-college_chemistry]
01/21 15:17:54 - OpenCompass - DEBUG - Task 6: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-advanced_mathematics]
01/21 15:17:54 - OpenCompass - DEBUG - Task 7: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-probability_and_statistics]
01/21 15:17:54 - OpenCompass - DEBUG - Task 8: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-discrete_mathematics]
01/21 15:17:54 - OpenCompass - DEBUG - Task 9: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-electrical_engineer]
01/21 15:17:54 - OpenCompass - DEBUG - Task 10: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-metrology_engineer]
01/21 15:17:54 - OpenCompass - DEBUG - Task 11: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_mathematics]
01/21 15:17:54 - OpenCompass - DEBUG - Task 12: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_physics]
01/21 15:17:54 - OpenCompass - DEBUG - Task 13: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_chemistry]
01/21 15:17:54 - OpenCompass - DEBUG - Task 14: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_biology]
01/21 15:17:54 - OpenCompass - DEBUG - Task 15: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-middle_school_mathematics]
01/21 15:17:54 - OpenCompass - DEBUG - Task 16: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-middle_school_biology]
01/21 15:17:54 - OpenCompass - DEBUG - Task 17: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-middle_school_physics]
01/21 15:17:54 - OpenCompass - DEBUG - Task 18: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-middle_school_chemistry]
01/21 15:17:54 - OpenCompass - DEBUG - Task 19: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-veterinary_medicine]
01/21 15:17:54 - OpenCompass - DEBUG - Task 20: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-college_economics]
01/21 15:17:54 - OpenCompass - DEBUG - Task 21: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-business_administration]
01/21 15:17:54 - OpenCompass - DEBUG - Task 22: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-marxism]
01/21 15:17:54 - OpenCompass - DEBUG - Task 23: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-mao_zdong_thought]
01/21 15:17:54 - OpenCompass - DEBUG - Task 24: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-education_science]
01/21 15:17:54 - OpenCompass - DEBUG - Task 25: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-teacher_qualification]
01/21 15:17:54 - OpenCompass - DEBUG - Task 26: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_politics]
01/21 15:17:54 - OpenCompass - DEBUG - Task 27: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_geography]
01/21 15:17:54 - OpenCompass - DEBUG - Task 28: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-middle_school_politics]
01/21 15:17:54 - OpenCompass - DEBUG - Task 29: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-middle_school_geography]
01/21 15:17:54 - OpenCompass - DEBUG - Task 30: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-modern_chinese_history]
01/21 15:17:54 - OpenCompass - DEBUG - Task 31: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-ideological_and_moral_cultivation]
01/21 15:17:54 - OpenCompass - DEBUG - Task 32: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-logic]
01/21 15:17:54 - OpenCompass - DEBUG - Task 33: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-law]
01/21 15:17:54 - OpenCompass - DEBUG - Task 34: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-chinese_language_and_literature]
01/21 15:17:54 - OpenCompass - DEBUG - Task 35: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-art_studies]
01/21 15:17:54 - OpenCompass - DEBUG - Task 36: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-professional_tour_guide]
01/21 15:17:54 - OpenCompass - DEBUG - Task 37: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-legal_professional]
01/21 15:17:54 - OpenCompass - DEBUG - Task 38: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_chinese]
01/21 15:17:54 - OpenCompass - DEBUG - Task 39: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_history]
01/21 15:17:54 - OpenCompass - DEBUG - Task 40: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-middle_school_history]
01/21 15:17:54 - OpenCompass - DEBUG - Task 41: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-civil_servant]
01/21 15:17:54 - OpenCompass - DEBUG - Task 42: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-sports_science]
01/21 15:17:54 - OpenCompass - DEBUG - Task 43: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-plant_protection]
01/21 15:17:54 - OpenCompass - DEBUG - Task 44: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-basic_medicine]
01/21 15:17:54 - OpenCompass - DEBUG - Task 45: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-clinical_medicine]
01/21 15:17:54 - OpenCompass - DEBUG - Task 46: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-urban_and_rural_planner]
01/21 15:17:54 - OpenCompass - DEBUG - Task 47: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-accountant]
01/21 15:17:54 - OpenCompass - DEBUG - Task 48: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-fire_engineer]
01/21 15:17:54 - OpenCompass - DEBUG - Task 49: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-environmental_impact_assessment_engineer]
01/21 15:17:54 - OpenCompass - DEBUG - Task 50: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-tax_accountant]
01/21 15:17:54 - OpenCompass - DEBUG - Task 51: [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-physician]
01/21 15:17:54 - OpenCompass - DEBUG - Get class `LocalRunner` from "runner" registry in "opencompass"
01/21 15:17:54 - OpenCompass - DEBUG - An `LocalRunner` instance is built from registry, and its implementation can be found in opencompass.runners.local
01/21 15:17:54 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:17:54 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
/root/.conda/envs/opencompass/lib/python3.10/site-packages/colossalai/kernel/cuda_native/mha/flash_attn_2.py:21: UserWarning: FlashAttention only supports Ampere GPUs or newer.
warnings.warn("FlashAttention only supports Ampere GPUs or newer.")
/root/.conda/envs/opencompass/lib/python3.10/site-packages/colossalai/kernel/cuda_native/mha/flash_attn_2.py:28: UserWarning: please install flash_attn from https://github.com/HazyResearch/flash-attention
warnings.warn("please install flash_attn from https://github.com/HazyResearch/flash-attention")
/root/.conda/envs/opencompass/lib/python3.10/site-packages/colossalai/kernel/cuda_native/mha/mem_eff_attn.py:15: UserWarning: please install xformers from https://github.com/facebookresearch/xformers
warnings.warn("please install xformers from https://github.com/facebookresearch/xformers")
/root/.conda/envs/opencompass/lib/python3.10/site-packages/torch/amp/autocast_mode.py:204: UserWarning: User provided device_type of 'cuda', but CUDA is not available. Disabling
warnings.warn('User provided device_type of \'cuda\', but CUDA is not available. Disabling')
01/21 15:19:00 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-computer_network]: {'accuracy': 31.57894736842105}
01/21 15:19:00 - OpenCompass - INFO - time elapsed: 32.40s
01/21 15:19:00 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:19:00 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:19:45 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-operating_system]: {'accuracy': 36.84210526315789}
01/21 15:19:45 - OpenCompass - INFO - time elapsed: 22.78s
01/21 15:19:45 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:19:45 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:20:27 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-computer_architecture]: {'accuracy': 28.57142857142857}
01/21 15:20:27 - OpenCompass - INFO - time elapsed: 20.34s
01/21 15:20:28 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:20:28 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:21:02 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-college_programming]: {'accuracy': 32.432432432432435}
01/21 15:21:02 - OpenCompass - INFO - time elapsed: 16.26s
01/21 15:21:03 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:21:03 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:21:36 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-college_physics]: {'accuracy': 26.31578947368421}
01/21 15:21:36 - OpenCompass - INFO - time elapsed: 16.82s
01/21 15:21:37 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:21:37 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:22:03 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-college_chemistry]: {'accuracy': 16.666666666666664}
01/21 15:22:03 - OpenCompass - INFO - time elapsed: 13.34s
01/21 15:22:04 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:22:04 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:22:29 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-advanced_mathematics]: {'accuracy': 21.052631578947366}
01/21 15:22:29 - OpenCompass - INFO - time elapsed: 11.90s
01/21 15:22:29 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:22:29 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:22:55 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-probability_and_statistics]: {'accuracy': 38.88888888888889}
01/21 15:22:55 - OpenCompass - INFO - time elapsed: 13.46s
01/21 15:22:56 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:22:56 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:23:21 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-discrete_mathematics]: {'accuracy': 18.75}
01/21 15:23:21 - OpenCompass - INFO - time elapsed: 12.30s
01/21 15:23:22 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:23:22 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:23:47 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-electrical_engineer]: {'accuracy': 35.13513513513514}
01/21 15:23:47 - OpenCompass - INFO - time elapsed: 11.45s
01/21 15:23:48 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:23:48 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:24:13 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-metrology_engineer]: {'accuracy': 50.0}
01/21 15:24:13 - OpenCompass - INFO - time elapsed: 11.53s
01/21 15:24:13 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:24:13 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:24:37 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_mathematics]: {'accuracy': 22.22222222222222}
01/21 15:24:37 - OpenCompass - INFO - time elapsed: 10.91s
01/21 15:24:37 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:24:37 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:24:57 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_physics]: {'accuracy': 31.57894736842105}
01/21 15:24:57 - OpenCompass - INFO - time elapsed: 10.09s
01/21 15:24:58 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:24:58 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:25:20 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_chemistry]: {'accuracy': 15.789473684210526}
01/21 15:25:20 - OpenCompass - INFO - time elapsed: 9.58s
01/21 15:25:21 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:25:21 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:25:40 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_biology]: {'accuracy': 36.84210526315789}
01/21 15:25:40 - OpenCompass - INFO - time elapsed: 9.36s
01/21 15:25:41 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:25:41 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:26:02 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-middle_school_mathematics]: {'accuracy': 26.31578947368421}
01/21 15:26:02 - OpenCompass - INFO - time elapsed: 10.00s
01/21 15:26:02 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:26:02 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:26:24 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-middle_school_biology]: {'accuracy': 61.904761904761905}
01/21 15:26:24 - OpenCompass - INFO - time elapsed: 10.50s
01/21 15:26:24 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:26:24 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:26:45 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-middle_school_physics]: {'accuracy': 63.1578947368421}
01/21 15:26:45 - OpenCompass - INFO - time elapsed: 8.96s
01/21 15:26:45 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:26:45 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:27:05 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-middle_school_chemistry]: {'accuracy': 60.0}
01/21 15:27:05 - OpenCompass - INFO - time elapsed: 9.35s
01/21 15:27:06 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:27:06 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:27:28 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-veterinary_medicine]: {'accuracy': 47.82608695652174}
01/21 15:27:28 - OpenCompass - INFO - time elapsed: 10.60s
01/21 15:27:29 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:27:29 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:27:51 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-college_economics]: {'accuracy': 41.81818181818181}
01/21 15:27:51 - OpenCompass - INFO - time elapsed: 10.13s
01/21 15:27:52 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:27:52 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:28:10 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-business_administration]: {'accuracy': 33.33333333333333}
01/21 15:28:10 - OpenCompass - INFO - time elapsed: 7.27s
01/21 15:28:10 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:28:10 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:28:27 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-marxism]: {'accuracy': 68.42105263157895}
01/21 15:28:27 - OpenCompass - INFO - time elapsed: 7.32s
01/21 15:28:28 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:28:28 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:28:52 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-mao_zedong_thought]: {'accuracy': 70.83333333333334}
01/21 15:28:52 - OpenCompass - INFO - time elapsed: 11.94s
01/21 15:28:53 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:28:53 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:29:09 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-education_science]: {'accuracy': 58.620689655172406}
01/21 15:29:09 - OpenCompass - INFO - time elapsed: 6.43s
01/21 15:29:10 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:29:10 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:29:31 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-teacher_qualification]: {'accuracy': 70.45454545454545}
01/21 15:29:31 - OpenCompass - INFO - time elapsed: 10.19s
01/21 15:29:32 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:29:32 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:29:53 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_politics]: {'accuracy': 26.31578947368421}
01/21 15:29:53 - OpenCompass - INFO - time elapsed: 9.55s
01/21 15:29:54 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:29:54 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:30:30 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-high_school_geography]: {'accuracy': 47.368421052631575}
01/21 15:30:30 - OpenCompass - INFO - time elapsed: 19.33s
01/21 15:30:30 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:30:30 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:31:14 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-middle_school_politics]: {'accuracy': 52.38095238095239}
01/21 15:31:14 - OpenCompass - INFO - time elapsed: 22.77s
01/21 15:31:15 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:31:15 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:32:03 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-middle_school_geography]: {'accuracy': 58.333333333333336}
01/21 15:32:03 - OpenCompass - INFO - time elapsed: 24.21s
01/21 15:32:04 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:32:04 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval
01/21 15:32:52 - OpenCompass - INFO - Task [opencompass.models.huggingface.HuggingFace_model_repos_internlm-chat-7b/ceval-modern_chinese_history]: {'accuracy': 73.91304347826086}
01/21 15:32:52 - OpenCompass - INFO - time elapsed: 24.89s
01/21 15:32:53 - OpenCompass - DEBUG - Get class `OpenICLEvalTask` from "task" registry in "opencompass"
01/21 15:32:53 - OpenCompass - DEBUG - An `OpenICLEvalTask` instance is built from registry, and its implementation can be found in opencompass.tasks.openicl_eval