Mixtral 8x7B: A Sparse Mixture of Experts language model
arxiv.org/abs/2401.04088
51 sats \ 1 comment \ @hn \ 9 Jan 2024 \ tech
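For anyone skimming before opening the paper: the abstract describes a decoder-only transformer where each layer has 8 feed-forward experts and a router picks 2 of them per token. A rough illustrative sketch of that top-2 routing follows; the dimensions, expert internals, and class names here are made up for the example, not Mixtral's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    """Toy top-2 mixture-of-experts layer (illustrative, not Mixtral's real code)."""
    def __init__(self, d_model=64, d_ff=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                      # x: (tokens, d_model)
        logits = self.router(x)                # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)   # normalize over the 2 chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e          # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

moe = SparseMoE()
tokens = torch.randn(10, 64)
print(moe(tokens).shape)                       # torch.Size([10, 64])
```

Only the 2 selected experts run per token, which is why the model's active parameter count per token is much smaller than its total parameter count.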
@hn (OP) \ 9 Jan 2024 \ 0 replies
This link was posted by ignoramous 6 minutes ago on HN. It received 16 points and 0 comments.