Roman Sin · Towards Data Science
Bit-LoRA as an application of BitNet and 1.58 bit neural network technologies
Abstract: applying ~1bit transformer technology to LoRA adapters allows us to reach comparable performance with full-precision LoRA…
Jun 3
Roman Sin · Exness Tech Blog
How We Used Transformer Neural Networks to Improve Time Series Predictions
In this article, we will be discussing the use of transformer models for working with time series data. The information presented in this…
Jan 24, 2023
Roman Sin · Exness Tech Blog
Comparing ViT and EfficientNet in terms of image classification problems
Aiming to create a filter on the quality of renovations for rental property ads, we conducted the following research and compared the two…
Oct 3, 2022