{"id":145681,"date":"2025-10-18T19:50:46","date_gmt":"2025-10-18T17:50:46","guid":{"rendered":"https:\/\/www.pauljorion.com\/blog\/?p=145681"},"modified":"2025-10-18T19:50:46","modified_gmt":"2025-10-18T17:50:46","slug":"pribor-che-contextual-hyper-embedding-uint8","status":"publish","type":"post","link":"https:\/\/www.pauljorion.com\/blog\/2025\/10\/18\/pribor-che-contextual-hyper-embedding-uint8\/","title":{"rendered":"PRIBOR : <b>CHE (Contextual Hyper-Embedding uint8)<\/b>"},"content":{"rendered":"<p class=\"p1\"><strong>CHE (Contextual Hyper-Embedding <a href=\"https:\/\/www.pauljorion.com\/blog\/2025\/10\/03\/pribor-logique-combinatoire-magique-preuve-de-concept\/\" target=\"_blank\" rel=\"noopener\">uint8<\/a>)<\/strong> est <strong>plus \u00e9conomique<\/strong> que l\u2019attention classique des LLMs. Des processus similaires sont d\u00e9j\u00e0 utilis\u00e9s mais moins \u00e9conomiques que CHE.<\/p>\n<p class=\"p1\">&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8211;<\/p>\n<h3 class=\"p1\">1. \u00c9conomie de m\u00e9moire<\/h3>\n<p class=\"p1\">\u2022<span class=\"Apple-converted-space\">\u00a0 <\/span>Attention standard : matrices float16\/float32 \u2192 700 \u00e0 4000 bits par token<\/p>\n<p class=\"p1\">\u2022<span class=\"Apple-converted-space\">\u00a0 C<\/span>HE <a href=\"https:\/\/www.pauljorion.com\/blog\/2025\/10\/03\/pribor-logique-combinatoire-magique-preuve-de-concept\/\" target=\"_blank\" rel=\"noopener\">uint8 \u2192 8 bits par token<\/a><\/p>\n<p class=\"p1\">\u2192 gain \u00d7 500 \u00e0 \u00d7 5000 en m\u00e9moire<\/p>\n<p class=\"p1\">&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8211;<\/p>\n<h3 class=\"p1\">2. Processus similaires d\u00e9j\u00e0 utilis\u00e9s<\/h3>\n<p class=\"p1\">\u2022<span class=\"Apple-converted-space\">\u00a0 <\/span>INT-FlashAttention (Peking University, 2024) : attention enti\u00e8rement en INT8, <strong>72 % plus rapide<\/strong>, <strong>82 % moins d\u2019erreur<\/strong><span class=\"Apple-converted-space\">\u00a0 \u00a0<\/span><\/p>\n<p class=\"p1\">\u2022<span class=\"Apple-converted-space\">\u00a0 <\/span>SageAttention (OpenReview, 2024) : attention en INT8 + lissage, <strong>plug-and-play<\/strong><span class=\"Apple-converted-space\">\u00a0 \u00a0<\/span><\/p>\n<p class=\"p1\">\u2022<span class=\"Apple-converted-space\">\u00a0 <\/span>LLM.int8() (NeurIPS 2022) : multiplication matricielle enti\u00e8rement en INT8<span class=\"Apple-converted-space\">\u00a0 \u00a0<\/span><\/p>\n<p class=\"p1\">\u2192 uint8 est d\u00e9j\u00e0 standard dans l\u2019attention quantifi\u00e9e.<\/p>\n<p class=\"p1\">&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8212;&#8211;<\/p>\n<h3 class=\"p1\">3. 
### 3. Compatibility with CHE

• CHE = a compressed uint8 (SHA-256[0:8]) → 8 bits per token (see the sketch at the end of this post)
• No 700×700 matrix, **no softmax**, **no floats**;
• Just a uint8 inside the ℝ⁴ triplet;
→ More economical, and an approach already in use in quantized attention.

Contact: pauljorion@pribor.ai
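To close, here is a minimal sketch of the compression step described in section 3, assuming that "SHA-256[0:8]" denotes the first 8 bits (one byte) of the token's SHA-256 digest. The function name and token handling are illustrative, not PRIBOR's actual implementation.

```python
import hashlib

def che_uint8(token: str) -> int:
    """Map a token to a single uint8: the first byte of its SHA-256 digest.

    Illustrative only: assumes "SHA-256[0:8]" means the first 8 bits.
    """
    return hashlib.sha256(token.encode("utf-8")).digest()[0]

# One byte per token, no floats, no softmax.
for tok in ["attention", "uint8", "CHE"]:
    print(tok, "->", che_uint8(tok))
```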