Creating an SVM from scratch - Practical Machine Learning Tutorial with Python p.25

Welcome to the 25th part of our machine learning tutorial series and the next part in our Support Vector Machine section. In this tutorial, we're going to begin setting up our own SVM from scratch.

Before we dive in, however, I will draw your attention to a few other options for solving this constraint optimization problem:

First, the topic of constraint optimization is massive, and there is quite a bit of material on the subject. Even just our subsection, convex optimization, is massive. A starting place for that might be: https://web.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf. For constraint optimization in general, you could also check out http://www.mit.edu/~dimitrib/Constrained-Opt.pdf

Within the realm of Python specifically, the CVXOPT package has various convex optimization solvers available, one of which handles exactly the kind of quadratic programming problem we have (found at cvxopt.solvers.qp).
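For context only, here is a minimal sketch of what handing this problem to cvxopt.solvers.qp could look like, using the hard-margin dual form and the same toy points we use below. The variable names, the support-vector threshold, and the hard-margin assumption are illustrative; this is not the route we take in this series.

```python
import numpy as np
from cvxopt import matrix, solvers

# Toy data: rows are samples, labels y in {-1, +1}
X = np.array([[1., 7.], [2., 8.], [3., 8.],
              [5., 1.], [6., -1.], [7., 3.]])
y = np.array([-1., -1., -1., 1., 1., 1.])
n = X.shape[0]

# Hard-margin dual: minimize (1/2) a^T P a - 1^T a  s.t.  a >= 0,  y^T a = 0
P = matrix(np.outer(y, y) * (X @ X.T))
q = matrix(-np.ones(n))
G = matrix(-np.eye(n))          # -a <= 0, i.e. a >= 0
h = matrix(np.zeros(n))
A = matrix(y.reshape(1, -1))
b = matrix(0.0)

solvers.options['show_progress'] = False
sol = solvers.qp(P, q, G, h, A, b)
alphas = np.ravel(sol['x'])

# Recover w and b from the support vectors (alphas noticeably above zero)
w = ((alphas * y)[:, None] * X).sum(axis=0)
sv = alphas > 1e-5
b_val = np.mean(y[sv] - X[sv] @ w)
print(w, b_val)
```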

Also, even more specific to SVMs, there is libsvm's Python interface, or the libsvm package in general. We are opting not to make use of any of these, as the optimization problem for the Support Vector Machine IS basically the entire SVM problem.

Now, to begin our SVM in Python.

https://pythonprogramming.net

https://www.facebook.com/pythonprogramming.net/
https://plus.google.com/+sentdex

#support vector machine #svm #machine learning #python #classification #artificial intelligence #tutorial
00:00:00: What is going on everybody, and welcome to part 25 of our machine learning tutorial series. In this part we are talking specifically about support vector machines. Up to this point we have covered the theory and logic behind how this is done, so I won't spend much time on theory here; if you missed that, you will want to go back and catch up. Anyway, let's go ahead and get started.

00:00:28: The first thing we will do is import matplotlib.pyplot as plt, then from matplotlib import style and use style.use('ggplot') (the style doesn't really matter), and we also need to import numpy as np. That is basically all we need to begin with.

00:00:55: We will start with some very simple, basic data. For now we will call it data_dict: it can be a dictionary where the keys are the classes, so -1 maps to a numpy array of a list of lists, and we do the same thing for the +1 class. Let's start adding some points: say [1, 7], then [2, 8], then [3, 8]; really simple stuff. For the other class, let's do [5, 1], [6, -1], and [7, 3]. It should be very clear where that data ends up on a graph.

00:02:15: For now we will leave that as it is and come back later to add the calls that actually run everything. Since we don't have anything besides data_dict yet, let's go ahead and start building the support vector machine class.

00:02:38: I don't expect you to fully understand classes from object-oriented programming. Just understand that the word self simply lets us use variables and other methods anywhere within the class, and that a method basically behaves like a function; if you aren't too familiar with objects and programming, think of it that way. We want the support vector machine to be an object so that we can train it once and then later run predictions, visualizations and so on without retraining. With K nearest neighbors, every prediction basically required retraining, so making it an object wasn't worth much there; here it is.

00:03:28: Anyway, we will call this class Support_Vector_Machine, to make it fully official.

00:03:39: Now we define __init__. I don't think I've explained this before, but basically none of the methods in a class run when you call the class, except for the init method, which is the initialization method. When you first call the support vector machine, the init method runs, so whatever you define there happens immediately; everything else runs only when you specifically call it. Here we have self, and we set visualization=True as a parameter. There will be some code that lets us graph the data; if we wanted to be as fast as possible we wouldn't visualize, but I want it in there so we can eyeball what is going on. Then self.visualization = visualization: for the whole class we simply set that value to whatever the user passed, and if the user said nothing we assume it is True.

00:04:56: Next we do self.colors. We probably don't strictly need it here (it might belong under the visualization check), but for now we'll put it here: if we are visualizing things, this is what we want, so the class that is +1 will be red and the negative class will be blue.

00:05:19: Then: if self.visualization, we say self.fig = plt.figure(), and then self.ax = self.fig.add_subplot(1, 1, 1). If you don't know matplotlib very well, I have a whole tutorial series on it; basically the figure is more or less the entire window, the subplot is a specific plot within it, and 1, 1, 1 means a one-by-one grid, plot number one. We only have one plot, but we still need this, and I'll explain why later: the axes have to be shared across the methods.

00:06:08: Once we have the init, the next thing is to figure out what else we will eventually need. We definitely want to think the way scikit-learn does: there will certainly be a fit and a predict. So let's go ahead and define fit. It needs self so it can share variables, but what else do we need? When we fit we are passing in data, so data goes in there too. For now it will just pass; we will fill in the rest later.

00:06:55: Finally, the other method we will definitely need is predict, which again takes self and the data to predict on. We can actually almost fill predict in already. You just have to remember what the prediction calculation is: basically, the classification is equal to the sign of x_i dot w plus b. Whatever the sign of that equation is, that is the class. So how do we do that nicely in Python?

00:07:44: First of all, for the sign you could write a quick lambda function, or just check whether the value is at or above zero versus below zero, but it turns out numpy actually has a sign function, so we will go ahead and do np.sign. Inside that is x dot w, and remember what the dot product gives us: it returns a scalar value. So we do np.dot of the features we pass in (let's rename the parameter to features rather than data, which makes a bit more sense) and self.w. We don't have w yet; getting it is the whole point of all this, and we will get there. Then we add self.b, and it is the same story: we don't have these yet, we still have to optimize for them, but once we have them we can make predictions.

00:09:03: So where do we get those values? When do we actually set them? We won't find them until we do the fitting, so of course the fit has to come first. To be clear, the fit is the training; they are basically the same thing, and we use the name fit because that is what scikit-learn calls it. Anyway, once we have self.w and self.b, that expression is your classification, and then we basically return the classification.

00:09:37: We will also graph things and so on, but for now we will leave it as it is and do the plotting when we reach the visualization part. I'm going to cut it here; in the next tutorial we will probably build out the whole rig, going through the full optimization and the search for w and b, which is where you will see the real heart of the support vector machine. We have built a decent amount, so that is it for now. If you have questions or concerns up to this point, feel free to leave them in the comments. Otherwise, as always, thanks for watching and for all the support and subscriptions, and until next time.
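For reference, here is a sketch of roughly where the code stands at the end of this part, reconstructed from the transcript above. fit is still an empty stub (the optimization comes in the next part), so calling predict before w and b exist would fail, exactly as noted in the video.

```python
import matplotlib.pyplot as plt
from matplotlib import style
import numpy as np
style.use('ggplot')


class Support_Vector_Machine:
    def __init__(self, visualization=True):
        # Remember whether the user wants plots, and build the figure/axes
        # once so every method can draw on the same subplot.
        self.visualization = visualization
        self.colors = {1: 'r', -1: 'b'}
        if self.visualization:
            self.fig = plt.figure()
            self.ax = self.fig.add_subplot(1, 1, 1)

    # Train: find self.w and self.b (to be filled in in the next tutorial).
    def fit(self, data):
        pass

    # classification = sign(x . w + b)
    def predict(self, features):
        classification = np.sign(np.dot(np.array(features), self.w) + self.b)
        return classification


data_dict = {-1: np.array([[1, 7],
                           [2, 8],
                           [3, 8]]),
             1: np.array([[5, 1],
                          [6, -1],
                          [7, 3]])}
```

Presumably, once fit is implemented, usage will look something like svm = Support_Vector_Machine(); svm.fit(data=data_dict); svm.predict([x1, x2]).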

sentdex

