
Benchmarks: Add Mixture of Experts Model #679

Open · wants to merge 19 commits into main
Conversation

@dpower4 (Contributor) commented Dec 19, 2024

Added an MoE model benchmark using MixtralConfig:

  1. Added 8x7b and 8x22b variants.
  2. Requires high VRAM because all experts are loaded into memory; training is therefore disabled due to memory constraints on the test worker. (See the sketch below.)
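For context, here is a minimal sketch of how a Mixtral-style MoE model can be constructed from `MixtralConfig` via Hugging Face transformers; this is an illustration, not the PR's actual `pytorch_mixtral.py` code. The parameter values are scaled down for clarity (Mixtral 8x7b uses hidden_size=4096, intermediate_size=14336, num_hidden_layers=32), which also shows why VRAM scales with the expert count:

```python
# Minimal sketch (not the PR's actual code): a scaled-down Mixtral-style
# MoE model built through Hugging Face transformers' MixtralConfig.
from transformers import MixtralConfig, MixtralForCausalLM

# Illustrative values only; the real 8x7b/8x22b variants use full-size settings.
config = MixtralConfig(
    vocab_size=32000,
    hidden_size=512,
    intermediate_size=1792,
    num_hidden_layers=4,
    num_attention_heads=8,
    num_key_value_heads=2,
    num_local_experts=8,     # all 8 expert MLPs are instantiated up front
    num_experts_per_tok=2,   # only 2 experts are routed to per token
)
model = MixtralForCausalLM(config)

# Even though only num_experts_per_tok experts run per token, the weights of
# all num_local_experts experts stay resident in memory, so VRAM usage scales
# with the total expert count -- the reason training is disabled on the test worker.
total_params = sum(p.numel() for p in model.parameters())
print(f"parameters: {total_params:,}")
```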

@dpower4 dpower4 requested review from cp5555, guoshzhao and a team as code owners December 19, 2024 17:11

codecov bot commented Dec 19, 2024

Codecov Report

Attention: Patch coverage is 88.05970% with 16 lines in your changes missing coverage. Please review.

Project coverage is 85.65%. Comparing base (249e21c) to head (83c9981).

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| ...nch/benchmarks/model_benchmarks/pytorch_mixtral.py | 86.44% | 16 Missing ⚠️ |

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #679      +/-   ##
==========================================
+ Coverage   85.61%   85.65%   +0.04%     
==========================================
  Files          99      100       +1     
  Lines        7165     7299     +134     
==========================================
+ Hits         6134     6252     +118     
- Misses       1031     1047      +16     
| Flag | Coverage Δ |
| --- | --- |
| cpu-python3.10-unit-test | 70.63% <39.84%> (-1.21%) ⬇️ |
| cpu-python3.7-unit-test | 70.00% <7.46%> (-1.81%) ⬇️ |
| cpu-python3.8-unit-test | 70.66% <40.45%> (-1.17%) ⬇️ |
| cuda-unit-test | 83.49% <87.21%> (+0.11%) ⬆️ |

Flags with carried forward coverage won't be shown.


@dpower4 added labels benchmarks (SuperBench Benchmarks), micro-benchmarks (Micro Benchmark Test for SuperBench Benchmarks), and model-benchmarks (Model Benchmark Test for SuperBench Benchmarks) on Dec 31, 2024
@dpower4 dpower4 requested a review from abuccts December 31, 2024 01:46