Comment by woile

1 month ago

Hey, `ollama run` as suggested on HF doesn't seem to work with this model. This worked instead:

ollama pull hf.co/sweepai/sweep-next-edit-1.5B

I've been using it with the Zed editor and it works quite well! Congrats.

This kind of AI is exactly the sort I like and am looking to run on my workstation.

  • Could you share the gist/config for how you made it work with Zed?

    • I had Claude add it as an edit-prediction provider (running locally on llama.cpp on my MacBook Pro). It's been working well so far (including next-edit prediction!), though it could use more testing and tuning. If you want to try it out you can build my branch: https://github.com/ihales/zed/tree/sweep-local-edit-predicti...

      If you have llama.cpp installed, you can start the model with `llama-server -hf sweepai/sweep-next-edit-1.5B --port 11434`
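      Once llama-server is up, Zed talks to its OpenAI-compatible `/v1/completions` endpoint. As a quick sanity check outside the editor, here's a minimal Python sketch that builds and sends the same kind of request (model name and port are taken from the command above; the prompt is just an arbitrary example):

```python
import json
import urllib.request

def build_completion_request(prompt,
                             model="sweepai/sweep-next-edit-1.5B",
                             max_tokens=2048,
                             base_url="http://localhost:11434"):
    """Build the URL and JSON body for llama-server's OpenAI-compatible
    /v1/completions endpoint (port 11434, as in the command above)."""
    body = {"model": model, "prompt": prompt, "max_tokens": max_tokens}
    return f"{base_url}/v1/completions", json.dumps(body).encode()

# With the server running, send it like this:
# url, data = build_completion_request("def add(a, b):")
# req = urllib.request.Request(url, data=data,
#                              headers={"Content-Type": "application/json"})
# print(json.load(urllib.request.urlopen(req))["choices"][0]["text"])
```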

      Add the following to your settings.json:

      ```
      "features": {
        "edit_prediction_provider": { "experimental": "sweep-local" }
      },
      "edit_predictions": {
        "sweep_local": {
          "api_url": "http://localhost:11434/v1/completions"
        }
      }
      ```

      Other settings you can add in `edit_predictions.sweep_local` include:

      - `model` - defaults to "sweepai/sweep-next-edit-1.5B"

      - `max_tokens` - defaults to 2048

      - `max_editable_tokens` - defaults to 600

      - `max_context_tokens` - defaults to 1200
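
      Putting those together, a fully spelled-out `sweep_local` block would look like this (the values shown are just the defaults listed above, written out explicitly):

      ```
      "edit_predictions": {
        "sweep_local": {
          "api_url": "http://localhost:11434/v1/completions",
          "model": "sweepai/sweep-next-edit-1.5B",
          "max_tokens": 2048,
          "max_editable_tokens": 600,
          "max_context_tokens": 1200
        }
      }
      ```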

      I haven't had time to dive into Zed edit predictions and do a thorough review of Claude's code (it's not much, but my rust is... rusty, and I'm short on free time right now), and there hasn't been much discussion of the feature, so I don't feel comfortable submitting a PR yet, but if someone else wants to take it from here, feel free!

    • This is it:

      ```
      {
        "agent": {
          "inline_assistant_model": {
            "model": "hf.co/sweepai/sweep-next-edit-1.5B:latest",
            "provider": "ollama"
          }
        }
      }
      ```