In Gong Tech Blog, by Omri Allouche: "What We've Learned From Building AI-Powered Products at Gong" (Mar 9, 2023). Introduction.
In Georgian Impact Blog, by Georgian: "State of Representation Learning — ICLR 2022" (May 20, 2022). By: Angeline Yasodhara & Rohit Saha.
In CodeX, by Christianlauer: "Python 3.11.0 is released — Impacts to Data Science and Engineering" (Oct 25, 2022). What are the Advantages of the new Release?
In TDS Archive, by Aparna Dhinakaran: "Three Pitfalls To Avoid With Embeddings" (Jul 20, 2022). Written in collaboration with Francisco Castillo Carrasco, data scientist at Arize AI.
In SyncedReview, by Synced: "OpenAI Presents a Simple and Efficient Training Strategy to Boost Language Models' Text-Infilling…" (Aug 4, 2022). Today's transformer-based large language models (LLMs) have proven a game-changer in natural language processing, achieving…
By Kuwar Kapur: "Guide to Embeddings in Recommender systems with HugeCTR" (Aug 13, 2022). NVIDIA provides you with an open-source framework Merlin™ which is used for building large-scale deep learning recommender systems. In this…
In TDS Archive, by Michael Bronstein: "Towards Geometric Deep Learning I: On the Shoulders of Giants" (Jul 4, 2022). In a new series of posts, we discuss how geometric ideas of symmetry underpinning Geometric Deep Learning have emerged through history.
By Hebbia: "Hebbia Raises $30 Million, Led by Index Ventures, to Launch the Future of Search" (Sep 27, 2022). Announcing our Series A.
In SyncedReview, by Synced: "Google Introduces RankT5: A Fine-Tuned T5 Model That Boosts Text Ranking and Zero-Shot Performance" (Oct 31, 2022). While BERT-based models have been applied to various text-ranking tasks, the potential of larger and more powerful sequence-to-sequence T5…