Python


All things Python programming related. Let's keep it friendly and on-topic :)

founded 2 years ago
Koan 18: The Loose Bundle (pythonkoans.substack.com)
submitted 1 month ago by monica_b1998@lemmy.world to c/python
2

The unearth syntax is like this:

usage: unearth [-h] [--verbose] [--index-url URL] [--find-link LOCATION]
               [--trusted-host HOST] [--no-binary] [--only-binary]
               [--prefer-binary] [--all] [--link-only] [--download [DIR]]
               [--python-version PY_VER] [--abis ABIS] [--implementation IMPL]
               [--platforms PLATFORMS]
               requirement

I want to get the whole list of all wheel files that argostranslate version 1.9.6 is dependent on, because that’s possibly the last version of argos-translate that worked offline.

My first attempt:

$ unearth --find-link /usr/local/src/argos-translate/wheel_cache/ argostranslate=1.9.6
usage: unearth [-h] [--verbose] [--index-url URL] [--find-link LOCATION] [--trusted-host HOST] [--no-binary PACKAGE]
               [--only-binary PACKAGE] [--prefer-binary] [--all] [--link-only] [--download [DIR]]
               requirement
unearth: error: argument requirement: invalid Requirement value: 'argostranslate=1.9.6'

It does not accept my version constraint, despite examples using that kind of syntax. So I had to do this:

$ unearth --all --find-link /usr/local/src/argos-translate/wheel_cache/ argostranslate

It dumped metadata on all versions. I picked through the heap of output and found this:

  {
    "name": "argostranslate",
    "version": "1.9.6",
    "link": {
      "url": "https://files.pythonhosted.org/packages/01/f9/b472322ea3de4752bbec7fb2f169f057872390a9ff35f72a2142d06392ae/argostranslate-1.9.6-py3-none-any.whl",
      "comes_from": "https://pypi.org/simple/argostranslate/",
      "yank_reason": null,
      "requires_python": ">=3.5",
      "metadata": null
    }
  },
  {
    "name": "argostranslate",
    "version": "1.9.6",
    "link": {
      "url": "https://files.pythonhosted.org/packages/6b/fc/13aa57857bee34f62cb9018c5fd4ec56da714431689b62cca9dce30a8877/argostranslate-1.9.6.tar.gz",
      "comes_from": "https://pypi.org/simple/argostranslate/",
      "yank_reason": null,
      "requires_python": ">=3.5",
      "metadata": null
    }
  },

That just gives the wheel for the app itself, not the wheels it depends on. How can I get the full list of wheels that argostranslate is dependent on using unearth?
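A side note on the earlier parse failure: PEP 440 exact pins use a double equals sign, so the single `=` is what the requirement parser rejects. Quoted to keep the shell out of it, something like this should at least parse (it only narrows the link listing to one version, though; it does not answer the dependency question):

```shell
# '==' is the PEP 440 exact-version operator; a lone '=' is invalid
unearth --all --find-link /usr/local/src/argos-translate/wheel_cache/ 'argostranslate==1.9.6'
```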

I should say that I supply the following option because I happen to already have all the wheel files:

--find-link /usr/local/src/argos-translate/wheel_cache/

That’s just to feed unearth as well as possible as an experiment. In fact it makes no difference if I omit the --find-link option. The end game is to be able to get this list in the future when I am starting from zero.

I might be looking to do something like --verbose --download --dry-run, but there is no --dry-run.

My wheel dir (/usr/local/src/argos-translate/wheel_cache/) currently contains these files:

annotated_types-0.7.0-py3-none-any.whl
argostranslate-1.10.0-py3-none-any.whl
blis-1.3.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
catalogue-2.0.10-py3-none-any.whl
certifi-2025.11.12-py3-none-any.whl
charset_normalizer-3.4.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl
click-8.3.1-py3-none-any.whl
cloudpathlib-0.23.0-py3-none-any.whl
confection-0.1.5-py3-none-any.whl
ctranslate2-4.6.2-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
cymem-2.0.13-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
emoji-2.15.0-py3-none-any.whl
filelock-3.20.1-py3-none-any.whl
fsspec-2025.12.0-py3-none-any.whl
idna-3.11-py3-none-any.whl
jinja2-3.1.6-py3-none-any.whl
joblib-1.5.3-py3-none-any.whl
markupsafe-3.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl
mpmath-1.3.0-py3-none-any.whl
murmurhash-1.0.15-cp311-cp311-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl
networkx-3.6.1-py3-none-any.whl
numpy-2.4.0-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
nvidia_cublas_cu12-12.8.4.1-py3-none-manylinux_2_27_x86_64.whl
nvidia_cuda_cupti_cu12-12.8.90-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
nvidia_cuda_nvrtc_cu12-12.8.93-py3-none-manylinux2010_x86_64.manylinux_2_12_x86_64.whl
nvidia_cuda_runtime_cu12-12.8.90-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
nvidia_cudnn_cu12-9.10.2.21-py3-none-manylinux_2_27_x86_64.whl
nvidia_cufft_cu12-11.3.3.83-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
nvidia_cufile_cu12-1.13.1.3-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
nvidia_curand_cu12-10.3.9.90-py3-none-manylinux_2_27_x86_64.whl
nvidia_cusolver_cu12-11.7.3.90-py3-none-manylinux_2_27_x86_64.whl
nvidia_cusparse_cu12-12.5.8.93-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
nvidia_cusparselt_cu12-0.7.1-py3-none-manylinux2014_x86_64.whl
nvidia_nccl_cu12-2.27.5-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
nvidia_nvjitlink_cu12-12.8.93-py3-none-manylinux2010_x86_64.manylinux_2_12_x86_64.whl
nvidia_nvshmem_cu12-3.3.20-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
nvidia_nvtx_cu12-12.8.90-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
packaging-25.0-py3-none-any.whl
preshed-3.0.12-cp311-cp311-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl
protobuf-6.33.2-cp39-abi3-manylinux2014_x86_64.whl
pydantic-2.12.5-py3-none-any.whl
pydantic_core-2.41.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
pyyaml-6.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl
regex-2025.11.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl
requests-2.32.5-py3-none-any.whl
sacremoses-0.1.1-py3-none-any.whl
sentencepiece-0.2.1-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
setuptools-80.9.0-py3-none-any.whl
smart_open-7.5.0-py3-none-any.whl
spacy-3.8.11-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
spacy_legacy-3.0.12-py2.py3-none-any.whl
spacy_loggers-1.0.5-py3-none-any.whl
srsly-2.5.2-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
stanza-1.10.1-py3-none-any.whl
sympy-1.14.0-py3-none-any.whl
thinc-8.3.10-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
torch-2.9.1-cp311-cp311-manylinux_2_28_x86_64.whl
tqdm-4.67.1-py3-none-any.whl
triton-3.5.1-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl
typer_slim-0.21.0-py3-none-any.whl
typing_extensions-4.15.0-py3-none-any.whl
typing_inspection-0.4.2-py3-none-any.whl
urllib3-2.6.2-py3-none-any.whl
wasabi-1.1.3-py3-none-any.whl
weasel-0.4.3-py3-none-any.whl
wrapt-2.0.1-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl


I can run unearth on the prefixes of each of those -- but that’s cheating, because in a future installation starting with nothing I will not know those packages or versions.

3

To download wheel dependencies for various offline machines using a laptop in a cafe with a flaky Internet connection, I would like to know the URLs of the files to fetch so I can use a more robust tool like aria2.

A command like the following is very fragile:

$ python -m pip download --verbose -d ./wheel_cache/ argostranslate

It can only handle the job if the uplink is fast and reliable. The logs of a download look like this:

…
Collecting sacremoses<0.2,>=0.0.53                                                                                                         
  Downloading sacremoses-0.1.1-py3-none-any.whl (897 kB)                                                                                   
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 897.5/897.5 kB 645.5 kB/s eta 0:00:00                                                        
…

It’s a shame the URLs are concealed even at this verbosity, and even when there are errors (which is when the URL is most needed). These docs show no way of increasing log verbosity further.
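One possible workaround, assuming a reasonably recent pip (the JSON installation report landed around pip 22.2): `python -m pip install --dry-run --ignore-installed --report report.json argostranslate` resolves everything without installing, and each resolved distribution carries its direct URL under `download_info.url` in the report. Resolution may still need to fetch some data over the flaky link, but the report at least exposes every URL in one machine-readable place. A minimal sketch of pulling the URLs out for aria2 (the sample data below is illustrative, not real pip output):

```python
import json

# Extract the direct download URL of every resolved distribution from a pip
# installation report, as produced by:
#   python -m pip install --dry-run --ignore-installed --report report.json argostranslate
# (pip >= 22.2). The report has an "install" list whose items carry
# "download_info" with a "url" key. Sample data below is made up.
sample_report = {
    "version": "1",
    "install": [
        {"download_info": {"url": "https://files.pythonhosted.org/packages/aa/argostranslate-1.9.6-py3-none-any.whl"}},
        {"download_info": {"url": "https://files.pythonhosted.org/packages/bb/sacremoses-0.1.1-py3-none-any.whl"}},
    ],
}

def report_urls(report: dict) -> list[str]:
    """Return the download URL recorded for each resolved item in the report."""
    return [item["download_info"]["url"]
            for item in report.get("install", [])
            if "download_info" in item]

for url in report_urls(sample_report):
    print(url)  # collect these into urls.txt and fetch with: aria2c -i urls.txt
```

In practice you would `json.load()` the real report.json instead of the inline sample.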

uv -- not ideal, but the uv method is still worth discussing

I heard a suggestion that uv could likely reveal the URLs. uv is becoming a popular replacement for the pip* tools, but for me it’s not mature or popular enough (judging by its absence from the official Debian repos). For the record, it would still be interesting to document how to use uv to derive the URLs.

Hacking

I am most interested in deriving the URLs using Debian-supported apps. Hacker methods welcome, e.g. clever use of strace with a pip* command. Though I don’t think strace would see the URLs, perhaps something like privoxy or socat would. Of course this could be quite tedious because the pip* commands have no simulation mode, so pip must first be given an opportunity to fetch every file. When a fetch fails on one file, pip terminates, which would force us to feed one URL at a time to aria2 in a manually intensive serial procedure.
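On the proxy idea: strace would only show encrypted TLS traffic, but an intercepting proxy can log the request URLs in the clear. A rough sketch with mitmproxy, which is packaged in Debian (assuming you accept trusting its CA certificate for the session; privoxy or socat could play a similar role with more fiddling):

```shell
# Terminal 1: run an intercepting proxy that prints every request URL
mitmdump --listen-port 8080

# Terminal 2: point pip at the proxy and at mitmproxy's CA bundle
python -m pip download --proxy http://127.0.0.1:8080 \
    --cert ~/.mitmproxy/mitmproxy-ca-cert.pem \
    -d ./wheel_cache/ argostranslate
```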

4
submitted 2 months ago* (last edited 2 months ago) by evenwicht to c/python

I had a working installation of argos-translate. Due to a dead fan, I had to move the hard drive to another machine (same model, but there are still differences).

Running argos-translate on the replacement machine gave “illegal instruction”. I figured a CPU variation must be in play here. So I ran this:

$ pipx uninstall argostranslate

That removed ~/.local/pipx/venvs/argostranslate and freed up ~7 GB of space. Then, to reinstall:

$ cd /usr/local/src/argos-translate; # git cloned
$ python3 -m venv env-t7500
$ source ./env-t7500/bin/activate
$ pipx install --pip-args='--compile --find-links wheel_cache' .

wheel_cache has the whl files I separately fetched with:

$ python -m pip download -d ./wheel_cache/ argostranslate

There were no errors in the installation, but it still gives “illegal instruction” when running argos-translate.

WTF? It should have forced compilation with --compile. Is there something that would have been missed with the uninstallation?

I possibly have the same problem as this bug, but then I have to wonder why it was able to compile. It should have failed at compile time, not at runtime -- unless we cannot trust the --compile option.
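One hedged observation: pip's --compile only byte-compiles the pure-Python .py files to .pyc; it does not rebuild the native extensions, which arrive prebuilt inside the manylinux wheels (ctranslate2, torch, etc.). If one of those wheels was built assuming a CPU feature the replacement machine lacks (AVX2, for instance), you would get exactly this: a clean install followed by an illegal instruction at runtime. A quick way to compare the two machines' CPU features:

```python
# List the SIMD/vector extensions the current CPU advertises, so the old and
# replacement machines can be compared. Linux-only (/proc/cpuinfo); returns
# an empty set elsewhere.
def cpu_flags(path="/proc/cpuinfo"):
    try:
        with open(path) as f:
            for line in f:
                if line.startswith("flags"):
                    return set(line.split(":", 1)[1].split())
    except OSError:
        pass
    return set()

interesting = {f for f in cpu_flags() if f.startswith(("avx", "sse", "fma"))}
print(sorted(interesting))
```

Run it on both machines; a flag present on the old box but missing on the new one would be the prime suspect.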

5
submitted 2 months ago* (last edited 2 months ago) by evenwicht to c/python

To download wheel dependencies for various offline machines using a machine that has a flaky Internet connection, how can the --python-version, --platform, and --abi be determined?

These docs say:

 * For the Python version, use sysconfig.get_python_version().
 * For the platform, use packaging.tags.platform_tags().

That’s for programmers, and it omits the ABI. What about end users of Python apps? I ran this on an online machine (not the target):

$ pipx install --pip-args='--dry-run' argostranslate

and pretty printed this JSON from the log file:

(run_subprocess:186): stdout:
{  
   "environment"                       : {  
      "implementation_name"            : "cpython",  
      "implementation_version"         : "3.11.2",  
      "os_name"                        : "posix",  
      "platform_machine"               : "x86_64",  
      "platform_python_implementation" : "CPython",  
      "platform_release"               : "6.xx.x-yy-amd64",  
      "platform_system"                : "Linux",  
      "platform_version"               : "#1 SMP Debian …",  
      "python_full_version"            : "3.11.2",  
      "python_version"                 : "3.11",  
      "sys_platform"                   : "linux"  
   },  
   "python_version"                    : "3.11.2",  
   "sys_path"                          : [  
      "/usr/lib/python311.zip",  
      "/usr/lib/python3.11",  
      "/usr/lib/python3.11/lib-dynload",  
      "/root/.local/pipx/venvs/argostranslate/lib/python3.11/site-packages",  
      "/root/.local/pipx/shared/lib/python3.11/site-packages"  
   ]  
}  

A usage example shows --platform linux_x86_64, which suggests I could build the platform string by concatenating sys_platform and platform_machine with an underscore. But examples of Python versions from that page look like “27” and “33”, not “3.11.2”. Is it a matter of dropping the dots?

Is there a better way to get this information?
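A sketch of answering this with the standard library alone, run on the target machine (the cp-prefixed ABI tag assumes CPython; other interpreters name their ABIs differently). pip's --python-version also accepts dotted forms like 3.11, so “27”/“33” are just the older dotless spelling. Note that cross-environment downloads generally also require --only-binary :all: (or --no-deps), since pip cannot build sdists for a foreign environment:

```python
import sys
import sysconfig

# Compute values for pip's cross-environment download flags using only the
# standard library. Run this on the machine you are targeting, then reuse
# the printed command on the online machine.
py_ver = f"{sys.version_info.major}.{sys.version_info.minor}"                 # e.g. 3.11
platform_tag = sysconfig.get_platform().replace("-", "_").replace(".", "_")   # e.g. linux_x86_64
abi_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"               # e.g. cp311 (CPython convention)

print(f"python -m pip download --only-binary :all: "
      f"--python-version {py_ver} --platform {platform_tag} --abi {abi_tag} "
      f"-d ./wheel_cache/ argostranslate")
```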

6

These instructions are given in the linked page:

git clone https://github.com/argosopentech/argos-translate.git
cd argos-translate
virtualenv env
source env/bin/activate
pip install -e .

I have replaced the last step with this:

PIPX_HOME=${prefix:-/opt/}/pipx PIPX_BIN_DIR=${prefix:-/usr/local}/bin pipx install --force --pip-args='--proxy http://127.0.0.1:8118/' /usr/local/src/argos-translate

Someone told me to use python -m venv instead of virtualenv because virtualenv is becoming obsolete. My biggest problem is that pip (which I guess is called by pipx) has a chronic problem of spontaneously timing out when fetching big whl files. Partial files are not kept and there is no crash recovery, which makes all the pip variants quite primitive. Such a fragile tool is unusable on unreliable WAN connections.

I also need to install argos-translate at a remote offline site. So I need to fetch all 7 gigs of dependencies in advance using something like wget or aria2c, and take a drive to the offline site to install argos-translate. How would I go about that? pip seems to be designed with the bad assumption that every machine is online.
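For the record, the usual shape of an offline install with stock pip (standard flags; paths are examples): download on the connected machine, carry the directory over, then install with the index disabled. The fragile step remains pip download itself, which is the crux of the sibling threads:

```shell
# On the online machine: resolve and fetch everything into a directory
python -m pip download -d ./wheel_cache/ argostranslate

# At the offline site: install using only the carried-over directory
python -m pip install --no-index --find-links ./wheel_cache/ argostranslate
```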

This page has a dead link:

The argostranslate pkg is only ~27 kB itself and apparently has 7 gigs of dependencies. I don’t imagine there would be a torrent just for the 27 kB, so is it safe to say the torrent gets the whole 7 gigs? If anyone knows the correct link, please share!

7

When this command is given:

PIPX_HOME=${prefix:-/opt/}/pipx PIPX_BIN_DIR=${prefix:-/usr/local}/bin pipx install --pip-args='--proxy http://127.0.0.1:8118/ --log-file ~/logs/pip-argostranslate.err --log ~/logs/pip-argostranslate.log' /usr/local/src/argos-translate

The output is:

pipx: error: unrecognized arguments: --log-file --log /root/logs/pip3-argostranslate.log /usr/local/src/argos-translate
(env) localhst:/usr/local/src/argos-translate# PIPX_HOME=${prefix:-/opt/}/pipx PIPX_BIN_DIR=${prefix:-/usr/local}/bin pipx install --pip-args='--proxy http://127.0.0.1:8118/ --log-file ~/logs/pip-argostranslate.err --log ~/logs/pip-argostranslate.log' /usr/local/src/argos-translate

Those so-called unrecognized arguments are documented in the pip man page.

The error msg drops an arg from the output as well, so it seems like an internal parsing problem.

I am just a user trying to install an app -- not a Python dev. It seems bizarre that such a mainstream language would have this basic issue. I wonder if pipx is rarely used -- that it’s obscure. Is that the case? Should I be using pip3 or pip instead?
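Two things may be worth ruling out before blaming pipx's parser: inside single quotes the shell leaves ~ unexpanded, so pip would receive a literal ~/logs/... path, and recent pip releases document only --log (the old --log-file spelling was dropped long ago, even if a stale man page still lists it). A variant to try (hypothetical, untested here):

```shell
# Double quotes let $HOME expand before pipx/pip see the path; only pip's
# currently documented --log option is passed.
mkdir -p "$HOME/logs"
pipx install --pip-args="--proxy http://127.0.0.1:8118/ --log $HOME/logs/pip-argostranslate.log" \
    /usr/local/src/argos-translate
```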

8