Timetable - sentdex - Machine Learning Summary
This is the timetable for sentdex. https://ml.streamdb.net/timelines-rss/c/UCfzlCWGWYyIQ0aLC5w48gBQ Thu, 07 Mar 24 04:01:15 +0900

- Introduction to Dataset building for fine-tuning. (00:00:00 - 00:02:53) https://ml.streamdb.net/timelines/v/pCX_3p40Efc/s/0/e/173 Thu, 07 Mar 24 04:01:15 +0900 [Building an LLM fine-tuning Dataset]
- The Reddit dataset options (Torrent, Archive.org, BigQuery) (00:02:53 - 00:06:07) https://ml.streamdb.net/timelines/v/pCX_3p40Efc/s/173/e/367 Thu, 07 Mar 24 04:01:15 +0900 [Building an LLM fine-tuning Dataset]
- Exporting BigQuery Reddit (and some other data) (00:06:07 - 00:14:44) https://ml.streamdb.net/timelines/v/pCX_3p40Efc/s/367/e/884 Thu, 07 Mar 24 04:01:15 +0900 [Building an LLM fine-tuning Dataset]
- Decompressing all of the gzip archives (00:14:44 - 00:25:13) https://ml.streamdb.net/timelines/v/pCX_3p40Efc/s/884/e/1513 Thu, 07 Mar 24 04:01:15 +0900 [Building an LLM fine-tuning Dataset]
- Re-combining the archives for target subreddits (00:25:13 - 00:28:29) https://ml.streamdb.net/timelines/v/pCX_3p40Efc/s/1513/e/1709 Thu, 07 Mar 24 04:01:15 +0900 [Building an LLM fine-tuning Dataset]
- How to structure the data (00:28:29 - 00:40:40) https://ml.streamdb.net/timelines/v/pCX_3p40Efc/s/1709/e/2440 Thu, 07 Mar 24 04:01:15 +0900 [Building an LLM fine-tuning Dataset]
At 00:32:15 you mention being unsure of the meaning of the parent IDs in the dataset. The Reddit post you linked for the BigQuery export contains a SELECT statement with a REGEXP_REPLACE of 't[0-9]_' on the link_id. According to GPT-4, link_id is a field that represents the ID of the post (submission) to which a comment belongs. Reddit IDs for posts and comments are often prefixed with a type indicator; t1 through t6 stand for Comment, Account (user), Link (post or submission), Message (private message), Subreddit, and Award. (A prefix-stripping sketch follows this block of entries.) (00:32:15 - 01:01:55) https://ml.streamdb.net/timelines/v/pCX_3p40Efc/s/1935/e/3715 Thu, 07 Mar 24 04:01:15 +0900 [Building an LLM fine-tuning Dataset]
- Building training samples and saving to database (00:40:40 - 00:48:49) https://ml.streamdb.net/timelines/v/pCX_3p40Efc/s/2440/e/2929 Thu, 07 Mar 24 04:01:15 +0900 [Building an LLM fine-tuning Dataset]
- Creating customized training json files (00:48:49 - 00:54:11) https://ml.streamdb.net/timelines/v/pCX_3p40Efc/s/2929/e/3251 Thu, 07 Mar 24 04:01:15 +0900 [Building an LLM fine-tuning Dataset]
- QLoRA training and results (00:54:11 - 01:01:55) https://ml.streamdb.net/timelines/v/pCX_3p40Efc/s/3251/e/3715 Thu, 07 Mar 24 04:01:15 +0900 [Building an LLM fine-tuning Dataset]
Bro also considered the fall 🤣🤣 (00:02:25 - 00:21:09) https://ml.streamdb.net/timelines/v/mm9IHqgCbZc/s/145/e/1269 Thu, 08 Feb 24 04:22:33 +0900 [Getting Back on Grid]
And how about Power over Ethernet, to daisy-chain all the required switches along the line? Or just use optic cable with an Ethernet-to-optic converter. (00:04:40 - 00:21:09) https://ml.streamdb.net/timelines/v/mm9IHqgCbZc/s/280/e/1269 Thu, 08 Feb 24 04:22:33 +0900 [Getting Back on Grid]
- (00:02:01 - 00:05:13) https://ml.streamdb.net/timelines/v/_GQfj3jhXVM/s/121/e/313 Mon, 25 Dec 23 01:38:27 +0900 [Open Source AI Inference API w/ Together]
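The type-prefix convention described in the parent-ID comment above can be handled with the same regex the BigQuery query uses. A minimal Python sketch, assuming the export leaves link_id/parent_id as prefixed strings; the helper name and example IDs are hypothetical:

```python
import re

# Reddit "fullname" prefixes, as listed in the comment above:
# t1 = comment, t2 = account (user), t3 = link/submission,
# t4 = private message, t5 = subreddit, t6 = award.
TYPE_PREFIX = re.compile(r"^t[0-9]_")

def strip_type_prefix(reddit_id: str) -> str:
    """Drop the t1_/t3_/... prefix so a comment's parent_id or link_id
    can be matched against bare comment/submission IDs."""
    return TYPE_PREFIX.sub("", reddit_id)

# Hypothetical examples: a top-level comment points at the submission (t3_...),
# while a reply points at another comment (t1_...).
assert strip_type_prefix("t3_abc123") == "abc123"
assert strip_type_prefix("t1_def456") == "def456"
```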
No one in his right mind can risk or even bear to put anything rotten into his body, nor put the rotten thing closer to those which are not rotten. Sin makes the heart unclean, but you can ask God to forgive you, to save your soul, to cleanse you of your sin, to purify your heart by the blood of His Son, our Lord and Savior, Jesus Christ, which He shed here on earth - "But He was wounded for our transgressions, He was bruised for our iniquities; the chastisement for our peace was upon Him, and by His stripes we are healed", Isaiah 53:5. (00:05:08 - 00:53:05) https://ml.streamdb.net/timelines/v/_GQfj3jhXVM/s/308/e/3185 Mon, 25 Dec 23 01:38:27 +0900 [Open Source AI Inference API w/ Together]
You must read your Bible slowly, attentively and repeatedly, having this in mind: that Christianity is not a religion but a Love relationship. It is measured by the love you have for God and the love you have for your neighbor. Matthew 5:13 says, "You are the salt of the earth; but if the salt loses its flavor, how shall it be seasoned? It is then good for nothing but to be thrown out and trampled underfoot by men." Our spirits can only be purified while in the body (while on earth), but after death anything unpurified (unclean) cannot enter Heaven's gates. Blessed are the pure in heart, for they shall see God (Matthew 5:8). (00:05:13 - 00:05:08) https://ml.streamdb.net/timelines/v/_GQfj3jhXVM/s/313/e/308 Mon, 25 Dec 23 01:38:27 +0900 [Open Source AI Inference API w/ Together]
I believe we are meant to be like Jesus in our hearts and not in our flesh. But be careful of AI, for it knows only things of the flesh, which are our fleshly desires, and cannot comprehend things of the spirit such as true love and eternal joy that comes from obeying God's Word. Man is a spirit and has a soul but lives in a body which is flesh. When you go to bed it is the flesh that sleeps, but your spirit never sleeps, and that is why you have dreams, unless you have died in peace physically. More so, true love that endures and lasts is a thing of the heart. When I say 'heart', I mean 'spirit'. But fake love, pretentious love, love with expectations, love for classic reasons, love for material reasons and love for selfish reasons - those are things of the flesh. In the beginning God said let us make man in our own image, according to our likeness. Take note, God is Spirit and God is Love. As Love He is the source of it. We also know that God is Omnipotent, for He creates out of nothing and He has no beginning and has no end. That means our love is but a shadow of God's Love. True love looks around to see who is in need of your help, your smile, your possessions, your money, your strength, your quality time. Love forgives and forgets. Love wants for others what it wants for itself. However, true love works in conjunction with other spiritual forces such as patience and faith - in the finished work of our Lord and Savior, Jesus Christ, rather than in what man has done such as science, technology and organizations, which won't last forever. To avoid sin and error, which leads to the death of your body and your spirit-soul in hell fire (the second death), you must make God's Word the standard for your life, not AI. If not, God will let you face AI on your own (with your own strength), and it will cast the truth down to the ground, it will be the cause of so much destruction like never seen before, it will deceive many and take many captive in order to enslave them into worshipping it and abiding in lawlessness. We can only destroy ourselves, but with God all things are possible. God knows us better because He is our Creator and He knows our beginning and our end.
The proof text can be found in the book of John -44, 2 Thessalonians 12, Daniel 2, Daniel 7-9, Revelation 13-15, Matthew 24-25 and Luke 21. (00:05:31 - 00:02:01) https://ml.streamdb.net/timelines/v/_GQfj3jhXVM/s/331/e/121 Mon, 25 Dec 23 01:38:27 +0900 [Open Source AI Inference API w/ Together]
Heaven is God's throne and the dwelling place for God's angels and the saints. Hell was meant for the devil (satan) and the fallen angels. Those who torture the souls in hell are demons (unclean spirits). Man's spirit is a free moral agent. You can either yield yourself to God or to the devil, because God has given us discretion. If one thinks he possesses only his own spirit, he is lying to himself and he is already in the dark. God is light while the devil is darkness. Light (the Holy Spirit) and darkness (an evil spirit) cannot stay together in a man's body. God is Love (Love is light), and where there is no love is hell, just as where there is no light is darkness. The one you yield yourself to, you will get his reward. The reward of righteousness to man's spirit is life (abundant life) and the reward of sin to man's spirit is death. Sin and satan are one and the same. Whatever sin can cause, satan also can cause. Sin is what gives the devil dominion or power over man's spirit. When God's Word becomes a part of you, sin's power over you is broken; you become the righteousness of God through Christ Jesus. Where Jesus is, you are, and when He went (to the Father), you went. In the book of John 8:42-47, Jesus said to them, "If God were your Father, you would love Me, for I proceeded forth and came from God; nor have I come of Myself, but He sent Me. Why do you not understand My speech? Because you are not able to listen to My word. You are of your father the devil, and the desires of your father you want to do. He was a murderer from the beginning, and does not stand in the truth, because there is no truth in him. When he speaks a lie, he speaks from his own resources, for he is a liar and the father of it. Which of you convicts Me of sin? And if I tell the truth, why do you not believe Me? He who is of God hears God's words; therefore you do not hear, because you are not of God." May God bless His Word in the midst of our hearts. Glory and honour be to God our Father, our Lord and Savior Jesus Christ and our Helper the Holy Spirit. Watch and pray!... Thank you for your time and may God bless you as you share this message with others. (00:08:42 - 00:25:25) https://ml.streamdb.net/timelines/v/_GQfj3jhXVM/s/522/e/1525 Mon, 25 Dec 23 01:38:27 +0900 [Open Source AI Inference API w/ Together]
This video exemplifies the challenge. (00:15:00 - 00:15:43) https://ml.streamdb.net/timelines/v/_GQfj3jhXVM/s/900/e/943 Mon, 25 Dec 23 01:38:27 +0900 [Open Source AI Inference API w/ Together]
"would have saved me some trouble" (00:15:43 - 00:25:25) https://ml.streamdb.net/timelines/v/_GQfj3jhXVM/s/943/e/1525 Mon, 25 Dec 23 01:38:27 +0900 [Open Source AI Inference API w/ Together]
@ Farmers' Almanac to Neural Networks (00:24:17 - 00:25:25) https://ml.streamdb.net/timelines/v/_GQfj3jhXVM/s/1457/e/1525 Mon, 25 Dec 23 01:38:27 +0900 [Open Source AI Inference API w/ Together]
Meditation in the Word of God is a visit to God, because God is in His Word. We know God through His Word, because the Word He speaks represents His heart's desires. Meditation is a thing of the heart, not a thing of the mind. Thinking is lower level while meditation is upper level. You think of your problems, your troubles, but in order to meditate you must let go of your own will, your own desires, your own ways, and let the Word you read prevail over the thinking process by thinking of it more and more, until the Word gets into your blood and gains supremacy over you. That is when meditation comes - naturally, without forcing yourself, turning the Word over and over in your heart. You can be having a conversation with someone while meditating in your heart - saying 'Thank you, Jesus...' over and over in your heart. But it is hard to meditate when you haven't let go of offence and past hurts. Your pain of the past, leave it for God; don't worry yourself, Jesus is alive, you can face tomorrow, He understands what you are passing through today. Begin to meditate on this prayer day and night (in all that you do): "Lord, take more of me and give me more of you. Give me more of your holiness, faithfulness, obedience, self-control, purity, humility, love, goodness, kindness, joy, patience, forgiveness, wisdom, understanding, calmness, perseverance... Make me a channel of shining light where there is darkness, a channel of pardon where there is injury, a channel of love where there is hatred, a channel of humility where there is pride..." The Word of God becomes a part of us by meditation, not by saying words but by spirit prayer (prayer from the heart). When the Word becomes a part of you, it will by its very nature influence your conduct and behavior. Your bad habits, you will no longer have the urge to do them. You will think differently, dream differently, act differently and talk differently - if something does not qualify for meditation, it does not qualify for conversation. (00:53:05 - 00:08:42) https://ml.streamdb.net/timelines/v/_GQfj3jhXVM/s/3185/e/522 Mon, 25 Dec 23 01:38:27 +0900 [Open Source AI Inference API w/ Together]
- (00:02:01 - 00:18:02) https://ml.streamdb.net/timelines/v/9MigSbQ7AQk/s/121/e/1082 Sun, 17 Dec 23 01:11:50 +0900 [INFINITE Inference Power for AI]
I believe we are meant to be like Jesus in our hearts and not in our flesh. But be careful of AI, for it is just our flesh and that is it. It knows only things of the flesh (our fleshly desires) and cannot comprehend things of the spirit such as peace of heart (which comes from obeying God's Word). Whereas we are a spirit and we have a soul but live in the body (in the flesh). When you go to bed it is your flesh that sleeps, but your spirit never sleeps (otherwise you have died physically); that is why you have dreams. More so, true love that endures and lasts is a thing of the heart (when I say 'heart', I mean 'spirit'). But fake love, pretentious love, love with expectations, love for classic reasons, love for material reasons and love for selfish reasons - that is a thing of our flesh. In the beginning God said let us make man in our own image, according to our likeness. Take note, God is Spirit and God is Love. As Love He is the source of it. We also know that God is Omnipotent, for He creates out of nothing and He has no beginning and has no end. That means our love is but a shadow of God's Love. True love looks around to see who is in need of your help, your smile, your possessions, your money, your strength, your quality time. Love forgives and forgets. Love wants for others what it wants for itself.
Take note, true love works in conjunction with other spiritual forces such as patience and faith (in the finished work of our Lord and Savior, Jesus Christ, rather than in what man has done such as science, technology and organizations, which won't last forever). To avoid sin and error, which leads to the death of our body and also our spirit in hell fire, we should let the Word of God be the standard of our lives, not AI. If not, God will let us face AI on our own and it will cast the truth down to the ground, it will be the cause of so much destruction like never seen before, it will deceive many and take many captive in order to enslave them into worshipping it and abiding in lawlessness. We can only destroy ourselves, but with God all things are possible. God knows us better because He is our Creator and He knows our beginning and our end. Our proof text is taken from the book of John -44, 2 Thessalonians 12, Daniel 2, Daniel 7-9, Revelation 13-15, Matthew 24-25 and Luke 21. Let us watch and pray... God bless you as you share this message with others. (00:05:31 - 00:02:01) https://ml.streamdb.net/timelines/v/9MigSbQ7AQk/s/331/e/121 Sun, 17 Dec 23 01:11:50 +0900 [INFINITE Inference Power for AI]
~ Yeah, I remember trying to implement a single-image (RGB) to depth model in 2018 based on a research paper, and it was a PITA (missing details, incomplete public implementations, hard-to-find datasets, fewer cloud offerings). The community was already growing at a good pace and a few people even helped me a bit, but it was still a niche. Fast forward 3-5 years and this plug-and-play that you mentioned is only getting more and more amazing! I even found models doing this optimized for mobile! (00:07:40 - 00:18:02) https://ml.streamdb.net/timelines/v/9MigSbQ7AQk/s/460/e/1082 Sun, 17 Dec 23 01:11:50 +0900 [INFINITE Inference Power for AI]
Regarding the philosophical question of whether or not AI should have rights similar to humans: you mention briefly at that point that internet data is a pretty decent representation of what the average person "thinks". I would tend to disagree with this, and to remain as pedantic and romantic as possible, I will address my point through a line of poetry... (that I wrote without the help of AI, mind you). (00:14:06 - 00:18:02) https://ml.streamdb.net/timelines/v/9MigSbQ7AQk/s/846/e/1082 Sun, 17 Dec 23 01:11:50 +0900 [INFINITE Inference Power for AI]
x faster than the one implemented in this one at that point or so. I expect that it will become exponentially faster with a dataset 5 GB in size. I used groupby in another part and that naively halved the time, but it's still far from optimized, and I'll probably upload my findings here with a ~5 GB dataset running on Colab sometime in the next couple of weeks. After that, I'll try the CuDF version. (00:06:45 - 00:12:04) https://ml.streamdb.net/timelines/v/OnYGtKQT-rU/s/405/e/724 Sat, 11 Nov 23 00:18:19 +0900 [Pandas Dataframes on your GPU w/ CuDF]
I'm running late for work, but wouldn't it be possible to vectorize this code and have it be faster than both the CuDF and the CPU versions of this benchmark? Curious to see how CuDF plays with vectorized versions. If I get the time I'll try some experiments and update this comment. (00:09:01 - 00:12:04) https://ml.streamdb.net/timelines/v/OnYGtKQT-rU/s/541/e/724 Sat, 11 Nov 23 00:18:19 +0900 [Pandas Dataframes on your GPU w/ CuDF]
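The vectorization question in the comment above can be checked with a tiny benchmark. A rough sketch, not the benchmark from the video: the column names and sizes are made up, and the cuDF note assumes cuDF >= 23.10 is installed:

```python
import time

import numpy as np
import pandas as pd

# To run the identical code on the GPU, cuDF's pandas accelerator mode can be
# enabled first (assumption: cuDF is installed with GPU support):
#   python -m cudf.pandas my_benchmark.py   (or %load_ext cudf.pandas in a notebook)

# Hypothetical toy data standing in for the video's benchmark dataset.
df = pd.DataFrame({
    "group": np.random.randint(0, 1000, 5_000_000),
    "value": np.random.rand(5_000_000),
})

t0 = time.time()
slow = df["value"].apply(lambda v: v * 2 + 1)   # row-wise Python-level loop
t1 = time.time()
fast = df["value"] * 2 + 1                      # vectorized, same result
t2 = time.time()
grouped = df.groupby("group")["value"].mean()   # the groupby the commenter mentions
t3 = time.time()

print(f"apply: {t1 - t0:.2f}s  vectorized: {t2 - t1:.2f}s  groupby: {t3 - t2:.2f}s")
```

The apply-versus-vectorized gap is usually the larger one, which is why the vectorization question is worth testing before, or alongside, switching to cuDF.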
- Why QLoRA? (00:00:00 - 00:00:55) https://ml.streamdb.net/timelines/v/J_3hDqSvpmg/s/0/e/55 Sat, 16 Sep 23 00:20:55 +0900 [QLoRA is all you need (Fast and lightweight model fine-tuning)]
- LoRA/QLoRA Research (00:00:55 - 00:04:13) https://ml.streamdb.net/timelines/v/J_3hDqSvpmg/s/55/e/253 Sat, 16 Sep 23 00:20:55 +0900 [QLoRA is all you need (Fast and lightweight model fine-tuning)]
- Fine-tuning dataset (00:04:13 - 00:11:10) https://ml.streamdb.net/timelines/v/J_3hDqSvpmg/s/253/e/670 Sat, 16 Sep 23 00:20:55 +0900 [QLoRA is all you need (Fast and lightweight model fine-tuning)]
- QLoRA Training Process (00:11:10 - 00:15:02) https://ml.streamdb.net/timelines/v/J_3hDqSvpmg/s/670/e/902 Sat, 16 Sep 23 00:20:55 +0900 [QLoRA is all you need (Fast and lightweight model fine-tuning)]
- QLoRA Adapters (00:15:02 - 00:17:10) https://ml.streamdb.net/timelines/v/J_3hDqSvpmg/s/902/e/1030 Sat, 16 Sep 23 00:20:55 +0900 [QLoRA is all you need (Fast and lightweight model fine-tuning)]
- Merging, Dequantizing, and Sharing (00:17:10 - 00:19:34) https://ml.streamdb.net/timelines/v/J_3hDqSvpmg/s/1030/e/1174 Sat, 16 Sep 23 00:20:55 +0900 [QLoRA is all you need (Fast and lightweight model fine-tuning)]
I feel like GPT-4 does that when a person decides to use a persona in the prompts, especially when using custom instructions. (00:19:32 - 00:23:56) https://ml.streamdb.net/timelines/v/J_3hDqSvpmg/s/1172/e/1436 Sat, 16 Sep 23 00:20:55 +0900 [QLoRA is all you need (Fast and lightweight model fine-tuning)]
- WSB QLoRA fine-tuned model examples (00:19:34 - 00:23:56) https://ml.streamdb.net/timelines/v/J_3hDqSvpmg/s/1174/e/1436 Sat, 16 Sep 23 00:20:55 +0900 [QLoRA is all you need (Fast and lightweight model fine-tuning)]
Hilarious [Terraform i see you] (00:21:04 - 00:23:56) https://ml.streamdb.net/timelines/v/J_3hDqSvpmg/s/1264/e/1436 Sat, 16 Sep 23 00:20:55 +0900 [QLoRA is all you need (Fast and lightweight model fine-tuning)]
🧩 A low-resource text classification method using K nearest neighbors and gzip compression for sentiment analysis. (00:00:00 - 00:01:48) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/0/e/108 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
[ ] ... Telling Clients: They don't need deep learning (00:00:30 - 00:19:47) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/30/e/1187 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
📊 The proposal involves compressing text and using normalized compression distances (NCD) as features for the K nearest neighbors classifier. (A short NCD sketch follows this block of entries.) (00:01:48 - 00:04:17) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/108/e/257 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
@VTunnel, you are wrong. At "And then we'll do that to the combined string." (00:03:50 - 00:19:47) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/230/e/1187 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
This is so simple yet so crazy, and it sounds too simple to be true. (00:03:51 - 00:19:47) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/231/e/1187 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
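As a reference for the NCD entries above, the distance itself needs nothing beyond the standard library. A minimal sketch of the idea, not the paper's or the video's exact code (the helper names and the space-joined concatenation are my own choices):

```python
import gzip

def clen(s: str) -> int:
    """Length in bytes of the gzip-compressed UTF-8 encoding of s."""
    return len(gzip.compress(s.encode("utf-8")))

def ncd(x: str, y: str) -> float:
    """Normalized compression distance:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))"""
    cx, cy = clen(x), clen(y)
    cxy = clen(x + " " + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Similar texts compress well together, so their NCD tends to be smaller.
print(ncd("the movie was great fun", "the film was great fun"))
print(ncd("the movie was great fun", "quarterly earnings fell sharply"))
```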
⏱️ The slowest part of the algorithm is computing the NCDs for all training samples, while the K nearest neighbors classification is fast. (00:04:17 - 00:06:07) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/257/e/367 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
🏅 Achieved around 70% accuracy in sentiment analysis with just 500 samples using K nearest neighbors and NCDs, outperforming random classification. (00:06:07 - 00:09:12) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/367/e/552 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
Great video, but at that point you should have said "compressions of every concatenated pair of texts"; then your explanation would have been much clearer! Then at (00:07:10 - 00:07:12) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/430/e/432 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
"similar sentimented pairs" etc. (00:07:12 - 00:19:47) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/432/e/1187 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
❓ Questions remain about the method's validity, potential problems, and why NCDs alone would be sufficient for sentiment classification. (00:09:12 - 00:10:36) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/552/e/636 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
❓ The creator expresses skepticism about the method, questioning the validity of comparing lengths of compressions for sentiment analysis. (00:10:36 - 00:11:00) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/636/e/660 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
ummm... mind blown. C'MON! (00:10:58 - 00:19:47) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/658/e/1187 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
📈 Accuracy varies significantly depending on the sample size, reaching around 75.7% for 10,000 samples. (00:11:00 - 00:13:20) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/660/e/800 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
⏱️ Linear NCD calculation for 10,000 samples takes hours, prompting the need for parallelization using multiprocessing. (00:13:20 - 00:15:00) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/800/e/900 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
💻 Practical usage involves compressing input strings, calculating NCD vectors against training samples, and using K nearest neighbors for sentiment classification. (See the sketch after these entries.) (00:15:00 - 00:17:46) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/900/e/1066 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
🧠 The method's success challenges the dominance of deep learning, reminding us of the value of revisiting first principles and exploring alternative algorithms like K nearest neighbors for NLP tasks. (00:17:46 - 00:19:47) https://ml.streamdb.net/timelines/v/jkdWzvMOPuo/s/1066/e/1187 Sat, 29 Jul 23 00:21:08 +0900 [Gzip is all You Need! (This SHOULD NOT work)]
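Putting the practical-usage and multiprocessing entries together, here is a hedged sketch of the classification step: compute the NCD of an input string against every training sample in parallel (that is the slow part), then take a majority vote over the k nearest labels. The tiny inline dataset and label names are purely illustrative, not the video's data:

```python
import gzip
from collections import Counter
from functools import partial
from multiprocessing import Pool

def ncd(x: str, y: str) -> float:
    """Normalized compression distance via gzip (see the sketch above)."""
    c = lambda s: len(gzip.compress(s.encode("utf-8")))
    cx, cy = c(x), c(y)
    return (c(x + " " + y) - min(cx, cy)) / max(cx, cy)

# Tiny illustrative training set; real runs would use hundreds or thousands of samples.
train_texts = ["loved it, great film", "what a fantastic movie",
               "terrible, waste of time", "awful acting, boring plot"]
train_labels = ["pos", "pos", "neg", "neg"]

def classify(test_text: str, k: int = 3) -> str:
    # Computing the NCDs against all training samples is the slow part,
    # so fan it out across processes.
    with Pool() as pool:
        distances = pool.map(partial(ncd, test_text), train_texts)
    # Indices of the k smallest distances, then a majority vote over their labels.
    nearest = sorted(range(len(distances)), key=distances.__getitem__)[:k]
    return Counter(train_labels[i] for i in nearest).most_common(1)[0][0]

if __name__ == "__main__":
    # Likely "neg" on this toy data, though gzip NCD is noisy on very short strings.
    print(classify("boring film, really awful"))
```

On real data, building the NCD vector against every training sample is what gets expensive, which is why the entries above report hours of runtime for 10,000 samples before parallelization.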
@ Bidens' auto prompt? I was thinking of extending nnfsip to wrap each attention and plug them into the context(s)?... (00:04:30 - 00:14:29) https://ml.streamdb.net/timelines/v/MNSmOih_pmg/s/270/e/869 Wed, 12 Jul 23 04:52:22 +0900 [Better Attention is All You Need]
… if it continues to go down for larger contexts, doesn't that give us very large context windows without performance drop-off? (00:12:39 - 00:14:29) https://ml.streamdb.net/timelines/v/MNSmOih_pmg/s/759/e/869 Wed, 12 Jul 23 04:52:22 +0900 [Better Attention is All You Need]