
entropix.ipynb session crashing #67

Open
danbri opened this issue Oct 10, 2024 · 3 comments

Comments


danbri commented Oct 10, 2024

I'm trying to run this via Colab (TPU backend) since I'm on a GPU-poor Chromebook. The notebook crashes before I can get the final cell to finish.

Last 10 entries in app.log:

Oct 10, 2024, 6:28:38 PM WARNING WARNING:root:kernel be25de06-202d-4903-9732-c75c13d58316 restarted
Oct 10, 2024, 6:28:38 PM INFO KernelRestarter: restarting kernel (1/5), keep random ports
Oct 10, 2024, 6:28:34 PM WARNING ERROR: Unknown command line flag 'xla_latency_hiding_scheduler_rerun'
Oct 10, 2024, 6:21:53 PM WARNING WARNING:root:kernel be25de06-202d-4903-9732-c75c13d58316 restarted
Oct 10, 2024, 6:21:53 PM INFO KernelRestarter: restarting kernel (1/5), keep random ports
Oct 10, 2024, 6:21:35 PM WARNING ERROR: Unknown command line flag 'xla_latency_hiding_scheduler_rerun'
Oct 10, 2024, 6:20:05 PM WARNING WARNING:root:kernel be25de06-202d-4903-9732-c75c13d58316 restarted
Oct 10, 2024, 6:20:05 PM INFO KernelRestarter: restarting kernel (1/5), keep random ports
Oct 10, 2024, 6:20:02 PM WARNING ERROR: Unknown command line flag 'xla_latency_hiding_scheduler_rerun'
Oct 10, 2024, 6:18:20 PM INFO Kernel started: be25de06-202d-4903-9732-c75c13d58316, name: python3

It got as far as emitting some of the canned prompting text:

<|begin_of_text|><|start_header_id|>system<|end_header_id|>
<antThinking>
You're absolutely right. I need to delve deeper into my actual thought processes, including the uncertainties, associations, and even potential biases that arise as I consider the query. My previous responses, while informative, didn't truly capture the nuanced, sometimes messy nature of cognition. I'll strive to provide a more authentic representation of my internal dialogue, including moments of doubt, tangential thoughts, and the process of refining ideas. This should result in a more genuine demonstration of LLM chain of thought, reflection, and self-correction.
</antThinking>

Which number is larger, 9.9 or 9.11?<|eot_id|><|start_header_id|>assistant<|end_header_id|>

<thinking>

It looks like the rest of the repo is updated more frequently. Should the notebook pull the latest code from the repo via git?
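
The repeated "Unknown command line flag 'xla_latency_hiding_scheduler_rerun'" warnings in the log above look like the TPU runtime rejecting a flag intended for another XLA build. A minimal diagnostic sketch for a Colab cell, assuming (unconfirmed) that the flag reaches the runtime through the `XLA_FLAGS` or `LIBTPU_INIT_ARGS` environment variables:

```python
import os

# Guesswork, not a confirmed fix: if the offending flag is injected via one
# of these environment variables, strip it before JAX/libtpu initializes and
# see whether the kernel still restarts.
for var in ("XLA_FLAGS", "LIBTPU_INIT_ARGS"):
    value = os.environ.get(var, "")
    if "xla_latency_hiding_scheduler_rerun" in value:
        cleaned = " ".join(
            tok for tok in value.split()
            if "xla_latency_hiding_scheduler_rerun" not in tok
        )
        os.environ[var] = cleaned
        print(f"stripped offending flag from {var}: {value!r} -> {cleaned!r}")

import jax  # import only after the environment has been adjusted
print(jax.devices())
```

If neither variable contains the flag, it is presumably being passed to the runtime some other way, and this cell will simply print the TPU devices.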

@Pclanglais

Just to add on: I tried to track down the issue. It seems to happen while loading the tokenizer with transformers (??). Just using it once was enough to crash the session.
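
A minimal isolation sketch, assuming the tokenizer is loaded through Hugging Face transformers; the model id below is a placeholder for whichever checkpoint the notebook actually downloads:

```python
# Hypothetical repro cell: load only the tokenizer, nothing else, to check
# whether the kernel dies at this step. Substitute the model id the
# entropix notebook really uses.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.2-1B-Instruct")
print(tokenizer("Which number is larger, 9.9 or 9.11?")["input_ids"][:10])
```

If this alone restarts the kernel on the TPU runtime, the problem is upstream of the sampling code.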

@stillmatic

maybe try this? https://github.com/stillmatic/entropix/blob/main/entropix_colab.ipynb

in Colab: https://colab.research.google.com/github/stillmatic/entropix/blob/main/entropix_colab.ipynb

It uses line magic to pull in the code. I can run the torch logic locally in a notebook with this code; if the link above doesn't work, then there may be a problem with the code or with Colab.
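
For reference, the usual Colab pattern for pulling code in via line/cell magic looks something like the following; the exact cells in the linked notebook may differ, and the requirements file name is an assumption:

```python
# Illustrative Colab cells, not necessarily the exact ones in the linked
# notebook: clone the fork, move into it, and install its dependencies so
# the running kernel always has the latest code from the repo.
!git clone https://github.com/stillmatic/entropix.git
%cd entropix
!pip install -q -r requirements.txt
```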
