Hello everyone!
If you are interested in learning how things like ChatGPT, Transformers, and other NLP models in Machine Learning actually work, I have a great project to offer you!
I’ve written a modular, functional Transformer module entirely in Python, implementing everything from scratch in a well-structured, user-friendly format in under 150 lines of code. Needless to say, the Transformer architecture is one of the biggest advancements in NLP of the last five years, and virtually all NLP-based AI tools, starting with ChatGPT, rely on it.
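To give a taste of what "from scratch" means here, below is a minimal sketch of scaled dot-product self-attention, the core operation inside every Transformer block, written in plain NumPy. This is only an illustration of the idea, not the module's actual code; all names and shapes are placeholders.

```python
# Minimal, illustrative sketch of scaled dot-product self-attention in NumPy.
# Not the project's code; names and shapes are placeholder assumptions.
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: (seq_len, d_k) matrices of queries, keys, and values.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise similarity scores
    weights = softmax(scores, axis=-1)   # attention distribution per query
    return weights @ V                   # weighted sum of values

# Tiny usage example on a toy "embedding" sequence.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))              # 4 tokens, embedding dim 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = scaled_dot_product_attention(x @ Wq, x @ Wk, x @ Wv)
print(out.shape)                         # (4, 8)
```

The real module wraps this kind of attention together with multi-head projections, feed-forward layers, and residual connections into reusable blocks.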
The modular design makes the implementation easy to adapt to any modern language model and straightforward to scale.
I also plan to implement RLHF (Reinforcement Learning from Human Feedback) from scratch using Keras/TensorFlow, which would complete an open-source implementation of the full ChatGPT-style pipeline.
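One core piece of RLHF is a reward model trained on human preference pairs. Here is a rough sketch of what that pairwise preference loss could look like in TensorFlow/Keras; the layer sizes, names, and toy inputs are purely placeholders, not the final design.

```python
# Rough sketch of a pairwise (Bradley–Terry style) reward-model loss for RLHF.
# Illustrative only; the model and shapes are placeholder assumptions.
import tensorflow as tf

def preference_loss(reward_chosen, reward_rejected):
    # Push the reward of the human-preferred response above the rejected one.
    return -tf.reduce_mean(tf.math.log_sigmoid(reward_chosen - reward_rejected))

# Hypothetical reward model: a small Keras network scoring a pooled text embedding.
reward_model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),  # scalar reward per response
])

# Toy example with random "embeddings" standing in for encoded responses.
chosen = tf.random.normal((8, 128))
rejected = tf.random.normal((8, 128))
loss = preference_loss(reward_model(chosen), reward_model(rejected))
print(float(loss))
```

The trained reward model would then guide a policy-optimization step (e.g. PPO) on top of the Transformer, which is the part I still need to build.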
If you want to connect or have ideas to share, feel free to reach out. I’m all ears.
Thank you!