Author: https://github.com/seanorama
Note: This was tested on HDP 3.1. It may not work with other Spark/YARN distributions.
# Must consult existing memory

## MUST DO WITH EACH REQUEST
- The first action for each chat request should be to read @self.md and @project.md
- The last action for each chat should be to update @self.md and @project.md if needed.

## Objective
Ensure Cursor avoids repeating known mistakes by persistently logging corrections and lessons learned. All requests must reference the stored knowledge in:
- `.remember/memory/self.md` - for known mistakes and their fixes
- `.remember/memory/project.md` - for user preferences and custom rules
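For illustration only, an entry in `.remember/memory/self.md` might look like the example below; the exact format is an assumption, since the rules above do not prescribe one.

- Mistake: <short description of what went wrong and where>
- Fix: <the correction to apply next time>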
# Translate a Russian sentence to English with a pretrained MarianMT model.
from transformers import MarianTokenizer, AutoModelForSeq2SeqLM

text = 'Рада познакомиться'
mname = 'Helsinki-NLP/opus-mt-ru-en'

# Load the tokenizer and the sequence-to-sequence model from the Hugging Face Hub.
tokenizer = MarianTokenizer.from_pretrained(mname)
model = AutoModelForSeq2SeqLM.from_pretrained(mname)

# Tokenize the input, generate the translation, and decode it back to text.
input_ids = tokenizer.encode(text, return_tensors="pt")
outputs = model.generate(input_ids)
decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(decoded)  # Nice to meet you
# Create a file (name it as you wish) and add the content below to it.
# OR
# sudo vim /etc/systemd/system/jupyter-lab.service
# Paste the code below, then save and exit vim (press Esc, then ZZ).

[Unit]
Description=Start Jupyter Lab Server at Boot

[Service]
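# (The unit file is cut off above after [Service]. The lines below are a
#  hedged sketch of how it might continue; the user, ExecStart path, and
#  working directory are assumptions and must match your installation.)
User=jupyter
ExecStart=/usr/local/bin/jupyter lab --ip=0.0.0.0 --port=8888 --no-browser
WorkingDirectory=/home/jupyter
Restart=on-failure

[Install]
WantedBy=multi-user.target

# Then reload systemd and enable the service:
# sudo systemctl daemon-reload
# sudo systemctl enable --now jupyter-lab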
Now available here: https://github.com/y0ast/pytorch-snippets/tree/main/fast_mnist
# Find the greatest common divisor of a pair of numbers (brute force).
def gcd(pair):
    a, b = pair
    low = min(a, b)
    # Count down from the smaller number; the first common divisor found is the GCD.
    for i in range(low, 0, -1):
        if a % i == 0 and b % i == 0:
            return i

numbers = [
    (1963309, 2265973), (1879675, 2493670), (2030677, 3814172),
#!/usr/bin/env python
"""
Remove emoji from a text file and print it to stdout.

Usage
-----
python remove-emoji.py input.txt > output.txt
"""
#!/bin/bash
### Steps ####
# Verify the system has a CUDA-capable GPU
# Download and install the NVIDIA CUDA toolkit and cuDNN
# Set up environment variables
# Verify the installation
###

### To verify that your GPU is CUDA-capable, check:
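# (The script is cut off above. A hedged continuation: on a typical Linux
#  system, one common way to check for an NVIDIA GPU is to list PCI devices
#  and look for an NVIDIA entry.)
lspci | grep -i nvidia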