Reza Bagheri (TDS Archive). "A Visual Understanding of the Softmax Function." The math and intuition behind the softmax function and its application in neural networks and softmax regression. Nov 3, 2024.
Vitality Learning. "Harnessing Singular Value Decomposition (SVD) for Efficient Neural Network Weight Pruning." As the field of deep learning continues to evolve, one challenge remains constant: the trade-off between model complexity and computational… Aug 10, 2024.
Sergey K. "ReAct: Rectified Activations for OOD Data Detection." An overview of the post-hoc method presented in a 2021 paper for the out-of-distribution detection task. Feb 15, 2023.
Deval Mehta. "Out-of-Distribution Detection for Skin Lesion Diagnosis." A summary of the MICCAI 2022 paper "Out-of-Distribution Detection for Long-tailed and Fine-grained Skin Lesion Images." Sep 15, 2022.
Purva Natoo. "Unlocking Machine Learning's Hidden Challenge: Detecting Out-of-Distribution Data." Out-of-distribution (OOD) data detection is an important problem in machine learning that involves identifying samples that are… Oct 11, 2023.
Raghul Asokan. "Neural Networks Intuitions: 15. ASH — Paper Explanation (OOD Detection Part 2)." ASH for OOD detection. Nov 13, 2022.
Shiro Matsumoto. "Two OODs: Out of Domain and Out of Distribution." Two concepts that are often confused. Feb 7, 2024.
Juneta Tao. "Out-of-Distribution Detection." Given a model trained on a large training set, it may not perform well on a test set collected, e.g., at a different time or in a different scenario… May 10, 2023.
Neeraj Varshney (Analytics Vidhya). "Out-of-Distribution Detection in Deep Neural Networks." Making AI systems robust and reliable. Dec 25, 2020.
Manish Chablani (TDS Archive). "Gradient Descent Algorithms and Adaptive Learning Rate Adjustment Methods." Here is a quick, concise summary for reference. For a more detailed explanation, please read: Jul 14, 2017.
Rice Yang. "An Overview of Mainstream Contrastive Learning Methods" (對比學習 Contrastive Learning 主流方法一覽). SimCLR, MoCo, SwAV, BYOL, CLIP, DeepCluster, PIRL, Barlow Twins: a look at the underlying logic of each model. Nov 24, 2022.
Harys Dalvi (TDS Archive). "A Fresh Look at Nonlinearity in Deep Learning." The traditional reasoning behind why we need nonlinear activation functions is only one dimension of this story. Aug 15, 2024.
Er Raqabi El Mehdi (The Startup). "Non-Convex Optimization in Deep Learning." Humans have been enjoying convex optimization (CO) for many years compared to the few contexts where they had to deal with non-convex… Jul 28, 2020.
Lucas de Lima Nogueira (TDS Archive). "Recreating PyTorch from Scratch (with GPU Support and Automatic Differentiation)." Build your own deep learning framework based on C/C++, CUDA and Python, with GPU support and automatic differentiation! May 14, 2024.
Juan Carlos Kuri Pinto (Secure and Private AI Writing Challenge). "How to Make Deep Learning Many Orders of Magnitude Faster." Deep learning needs no big introduction due to its wild popularity. Deep learning, or deep neural networks, are a very useful set of machine… Aug 24, 2019.
azhar (azhar labs). "Rotary Positional Embeddings: A Detailed Look and Comprehensive Understanding." Since the "Attention Is All You Need" paper in 2017, the Transformer architecture has been a cornerstone in the realm of Natural Language… Jan 11, 2024.
Everton Gomede, PhD (Towards Dev). "Mastering Discreteness: Enhancing Neural Network Performance with the Gumbel-Softmax Distribution." Abstract. May 4, 2024.
Freedom Preetham (Mathematical Musings). "Mathematical Bridge Between Softmax Functions and Gibbs Distributions." The softmax function is an essential element in various neural networks, particularly those designed for classification tasks. It… Apr 26, 2024.
Runzhong Wang (TDS Archive). "How to Encode Constraints to the Output of Neural Networks." Wondering how? Here is a systematic review of available approaches. Apr 14, 2024.