Comment by antisol
6 days ago
> System Requirements
> Works literally everywhere
Haha, on one of my machines my Python version is too old, and the package/dependencies don't want to install.
On another machine the Python version is too new, and the package/dependencies don't want to install.
I opened a couple of PRs to fix this situation:
https://github.com/KittenML/KittenTTS/pull/21 https://github.com/KittenML/KittenTTS/pull/24 https://github.com/KittenML/KittenTTS/pull/25
If you have `uv` installed, you can try my merged ref that has all of these PRs (and #22, a fix for short generation being trimmed unnecessarily) with a single uvx invocation.
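The general shape is something like this; the fork URL, branch, and entry-point name below are placeholders, not the actual ref:

```sh
# placeholder coordinates -- substitute the real fork URL and branch;
# `uvx --from git+...` runs a package from a Git ref in a throwaway env
# (the `kittentts` entry-point name is assumed)
uvx --from "git+https://github.com/example/KittenTTS@merged-fixes" kittentts --help
```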
Thanks for the quick intro to uv; it looks like Docker layers for Python.
I found the TTS a bit slow, so I piped the output into ffplay with a 1.2x speedup to make it sound a bit better.
Ah, yeah, good catch – I added the model-native speed multiplier to the CLI too (`--speed=1.2` for instance).
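So you can do it on either side now. A sketch, where `--speed` is the new flag and the other TTS flags are assumptions for illustration:

```sh
# model-native speedup via the new CLI flag
# (--text/--output are assumed flag names, not confirmed)
kittentts --text "Hello there" --speed=1.2 --output hello.wav

# or speed up on the playback side instead -- atempo is a stock ffmpeg filter
ffplay -autoexit -nodisp -af atempo=1.2 hello.wav
```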
Install it with uvx; that should solve the Python issues.
https://docs.astral.sh/uv/guides/tools/
uv installation:
https://docs.astral.sh/uv/getting-started/installation/
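Concretely, something like this (the installer one-liner is from the linked docs; the package/entry-point name is assumed):

```sh
# install uv, then run the tool in an isolated, uv-managed environment
# that brings along a compatible Python of its own
curl -LsSf https://astral.sh/uv/install.sh | sh
uvx kittentts --help   # package/entry-point name assumed
```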
Yeah some people have a problem and think "I'll use Python". Now they have like fifty problems.
I had the "too new" one.
This package is the epitome of dependency hell.
Seriously, stick with piper-tts.
Easy to install; 50 MB gives you excellent results, and 100 MB gives you good results with hundreds of voices.
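Basic usage is roughly a one-liner (the voice name is one of piper's published en_US models; exact flag spelling can differ between piper releases):

```sh
# synthesize a wav from stdin text with a piper voice model
echo 'Welcome to the world of speech synthesis!' \
  | piper --model en_US-lessac-medium --output_file welcome.wav
```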
It doesn't work on Fedora because g++ isn't available in the right version.
Not sure if they've fixed between then and now, but I just had it working locally on Fedora.
We are working to fix that. Thanks.
"Fixing python packaging" is somewhat harder than AGI.
I was commiserating with my brother over how difficult it is to set up an environment to run one LLM or diffusion model, let alone multiple or a combination. It's 5% CUDA/ROCm difficulties and 95% Python difficulties. We have a theory that anyone working with generative AI has to tolerate output that is only 90% right, and is totally fine working with a language and environment that only 90% works.
Why is Python so bad at that? It's less kludgy than Bash scripts, but even those are easier to get working.
This is how we'll know ASI has arrived.
Have you considered offering a uvx command to run to get people going quickly?
Though I think you would still need to have the Python build dependencies installed for that to work.
Just point people to uv/uvx.
A tool that was only released, what, a year or two ago? It simply won't be present in nearly all OS/distros. Only modern or rolling releases will have it (maybe). It's funny when the recommended Python dependency managers are just as hard to install and use as the scripts themselves. Very Python.
The project is like 80% there by having a pyproject file that should work with uv and Poetry. There just aren't any package versions specified, the Python version constraint is incredibly lax, and no lock file is provided.
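Closing that gap is a couple of commands with uv (the package name and bounds below are illustrative, not KittenTTS's actual deps):

```sh
# write a bounded version constraint into pyproject.toml's dependencies
uv add 'soundfile>=0.12,<0.13'
# record the project's interpreter in .python-version
# (requires-python in pyproject.toml is tightened by hand)
uv python pin 3.11
# resolve everything and emit uv.lock for reproducible installs
uv lock
```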
Python man
There you go.
Computer scientists love Python, not just because whitespace comes first ASCIIbetically, but because it's the standard. Everyone else loves Python because it's PYTHON!
You're getting a lot of comments along the lines of "Why don't you just ____," which only shows how Stockholmed the entire Python community is.
With no other language are you expected to maintain several entirely different versions of the language, each of which is a relatively large installation. Can you imagine if we all had five different llvms or gccs just to compile five different modern C projects?
I'm going to get downvoted to oblivion, but it doesn't change the reality that Python in 2025 is unnecessarily fragile.
That’s exactly what I have. The C++ codebases I work on build against a specific pinned version of LLVM with many warnings (as errors) enabled, and building with a different version entails a nonzero amount of effort. Ubuntu will happily install several versions of LLVM side by side, or compilation can be done in a Docker container with the correct compiler. Similarly, the TypeScript codebases I work with test against specific versions of node.js in CI, and the engines field in package.json is specified. The different versions are managed via nvm. Python is the same via uv and pyproject.toml.
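The per-project pinning looks almost identical across the ecosystems (version numbers here are illustrative):

```sh
nvm use 20               # Node: switch this shell to Node 20
uv python install 3.11   # Python: fetch a standalone interpreter...
uv python pin 3.11       # ...and record it in .python-version for this repo
```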
I don't doubt it, but I don't think that situation is accepted as the default in C/C++ development. For the most part, I expect OSS to compile with my own clang.
I agree with your point, but
> if we all had five different llvms or gccs
Oof, those are poor examples. Most compilers using LLVM other than clang do ship with their own LLVM patches, and cross-compiling with GCC does require installing a toolchain for each target.
Cross-compiling is a totally different subject… I'm trying to make an apples-to-apples comparison. If you compile a lot of OSS C projects for the host architecture, you typically do not need multiple LLVMs or GCCs. Usually, the makefile detects various things about the platform and compiler and then fails with an inscrutable error. But that is a separate issue! haha
> Can you imagine if we all had five different llvms or gccs just to compile five different modern C projects?
Yes, because all I have to do is look at the real world.
There are still people who use machine-wide Python installs instead of environments? Python dependency hell was already bad years ago, but today it's completely impractical to do it this way. Even on Raspberry Pis.
Yep. Python stopped being Python a decade ago. Now there are just innumerable Pythons. Perl, on the other hand: you can still run any Perl script from any era on any system perl interpreter and it works! Granted, Perl is unpopular and not getting constant new features re: hardcore math/computation libs.
Anyway, I think I'll stick with Festival 1.96 for TTS. It's super fast even on my Core 2 Duo, and I have exactly zero chance of getting this Python-3-ish script to run on any machine with an OS older than a handful of years.
It breaks my heart that Perl fell out of favor. Perl “6” didn’t help in the slightest.
Debian pretty much "solved" this by making pip refuse to install packages if you are not in a venv.
It needed distro buy in and implementation, but this is from the Python side: https://peps.python.org/pep-0668/
IIRC that's actually a change in upstream pip.
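In practice it looks like this on Debian/Ubuntu (output abbreviated; the stdlib path varies by Python version):

```sh
$ pip install requests
error: externally-managed-environment
× This environment is externally managed
# the guard is just a marker file the distro drops into the stdlib dir:
$ ls /usr/lib/python3.12/EXTERNALLY-MANAGED
/usr/lib/python3.12/EXTERNALLY-MANAGED
```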
Well, with my Python 3.13.5 not even that works!
Pretty impressive but this seems to be a staple of most AI/ML projects.
"Works on my machine" or "just use docker", although here the later doesn't even seem to be an option.
Ditto openSUSE, at least on Tumbleweed.
Using venv won't save you from having the wrong version of the actual Python interpreter installed.
venvs can be attached to arbitrary interpreters.
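For example (assuming the interpreters are installed or fetchable):

```sh
python3.11 -m venv .venv-311      # venv bound to the python3.11 on PATH
uv venv --python 3.12 .venv-312   # uv will even download 3.12 if it's missing
```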
Such an ignorant thing to say for something that requires 25MB RAM.
Not sure what the size has to do with anything.
I send you a 500kb Windows .exe file and claim it runs literally everywhere.
Would it be ignorant to say anything against it because of its size?
We all know "runs anywhere" in this context means compute-wise. It's dumb to blame the author for your dev setup issues.
It reminds me of the costs and benefits of RollerCoaster Tycoon being written in assembly language. Because it was so light on resources, it could run on any privately owned computer, or at least anything x86, which was pretty much everything at the time.
Now, RISC architectures are much more common, so instead of the rare 68K Apple/Amiga/etc computer that existed at the time, it's super common to want to run software on an ARM or occasionally RISC-V processor, so writing in x86 assembly language would require emulation, making for worse performance than a compiled language.
System Python is for system applications that are known to work together. If you need a Python install for something else, there's venv or conda, and then pip install stuff.
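i.e. the usual workflow (the package name is assumed from the thread; the project may ship wheels instead of a PyPI release):

```sh
python3 -m venv .venv    # project-local environment
. .venv/bin/activate
pip install kittentts    # leaves the system Python untouched
```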
You're supposed to use a venv for everything but the Python scripts distributed with your OS.