I have been working in the HTTP client field, both secretly and publicly, for an entire year; here is what my newly acquired expertise got us. This will shock you, so be prepared. Seatbelt on! Let's travel back to the future.
As Python programmers, we should be aware that 2024 is almost here, yet most of the community still lives like it's 2015. And we don't always know it. That's insane.
Let us crawl out of the pit we inadvertently dug ourselves, together. Almost all of us started with Requests, and now we are trapped in it, and sometimes forced out of it. The thing is... habits die hard!
I was too lazy to migrate my projects away from Requests, so I decided to tackle building the next Requests myself.
🏆 (I) HTTP/2 by Default, HTTP/3 Broadly Available!
👴 Internet Explorer 11 just celebrated its tenth anniversary this year, and you'll never guess: it already supports HTTP/2 (not activated by default*).
Meanwhile, as of November 2023, the Python ecosystem has no HTTP client supporting HTTP/2 and HTTP/3 by default.
Niquests does, with great incentives! 👇
*: Doesn't this remind you of a particular HTTP client? HTTP/1 by default, optional HTTP/2?
🔀 (II) Multiplexed Connection
- We've just almost murdered Async and made you watch!
Beware that reading further will trigger a "cannot be unseen" moment in your head.
🚨 Some ideas are just sticky, like "faster HTTP requires async, or your code will be slower." Wrong!
Do you, too, work at a company with programs exceeding 100k lines that make synchronous HTTP requests? No way the company would allow you to rewrite the whole thing to async, for budget and mental sanity reasons? We are here to save the day.
Let's do a quick MCQ!
According to you, how long does this take to run? We are fetching https://pie.dev/delay/3 twice. The controller behind it does a sleep(3), then returns a valid JSON output.
from requests import Session

responses = []

with Session() as s:
    responses.append(s.get("https://pie.dev/delay/3"))
    responses.append(s.get("https://pie.dev/delay/3"))

print([r.status_code for r in responses])
- A) 6 seconds
- B) 3 seconds
- C) 12 seconds
✅ ¡spuoɔǝs 9 sɐʍ ɹǝʍsuɐ ǝɥ⊥
Again!
from niquests import Session

responses = []

with Session(multiplexed=True) as s:
    responses.append(s.get("https://pie.dev/delay/3"))
    responses.append(s.get("https://pie.dev/delay/3"))

print([r.status_code for r in responses])
- A) 6 seconds
- B) 3 seconds
- C) 12 seconds
✅ ¡spuoɔǝs Ɛ sɐʍ ɹǝʍsuɐ ǝɥ⊥
With multiplexed enabled, you have just transferred the whole async/await burden to the remote peer! Isn't it awesome?! No hidden magic trick under the carpet, no async, no threads!
There's a non-negligible chance you'd rather write those 6 lines with a mere multiplexed=True than rewrite everything around async/await!
The great thing here is that we made it completely transparent for our fellow developers.
You are literally one toggle away from leveraging one or many multiplexed connections!
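To give a feel for how this scales, here is a minimal sketch that relies only on the Session(multiplexed=True) usage shown above; the endpoint is the same pie.dev delay route, and the exact timings obviously depend on the server and your network.

from time import perf_counter

from niquests import Session

# Ten endpoints that each take ~3 seconds to answer.
urls = ["https://pie.dev/delay/3"] * 10

start = perf_counter()

with Session(multiplexed=True) as s:
    # Every request goes over a single multiplexed connection; nothing
    # blocks while the server works on them concurrently.
    responses = [s.get(url) for url in urls]
    # Accessing a property such as status_code resolves that exchange.
    print([r.status_code for r in responses])

# Expected to land far closer to ~3 s than to ~30 s.
print(f"elapsed: {perf_counter() - start:.1f}s")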
🔒 (III) Enhanced Security
- TLS 1.2+ was just the starting block
Earlier this year, some HTTP clients decided to enforce TLS 1.2 as the minimum supported version. We decided to go all in: not only did we enforce TLS 1.2, but we also enforce the cipher suites that Mozilla regularly updates, so that we remain safe against sophisticated types of attacks. Quantum CPUs are a thing, right?
No other HTTP client is capable of providing a basic level of assurance that the peer certificate is, in fact, (still) valid.
With the growing number of issued Let's Encrypt (short-lived) certificates, more assurance is needed. It is not bulletproof, as it follows the fail-safe strategy that most browsers do, unless strict-mode is enabled.
- The dependency chain is certified!
Almost all the dependencies in the chain are verifiable at any moment. They all benefit from reproducible builds.
Each of our releases comes with SLSA provenance data (multiple.intoto.jsonl), which can be used to verify the source and provenance of the binaries with the slsa-verifier tool.
slsa-verifier verify-artifact ./niquests-3.2.3-py3-none-any.whl --provenance-path multiple.intoto.jsonl --source-uri github.com/jawah/niquests --source-tag
- Certificates are validated against a trustworthy source 👇 (Spoiler: your OS)
🪪 (IV) System Root CAs
Almost everyone is using Certifi or has seen it pass through the usual pip install ... routine. In a nutshell, Certifi is a package that ships with a static collection of certificate authorities, or root CAs if you prefer.
Certifi was a mistake, a huge one. A workaround is great to move on, but only for a short period of time.
Here are the cons:
- This list of trusted CAs is so critical that you should be scared if anything happened to it. It is almost never updated by users, so your local copy is most likely outdated.
- Certifi does not ship with your company's certificates! So requesting internal services may come with additional painful extra steps, and the same goes for a local development environment that relies on its own (self-signed) CA.
This is why you should migrate!
Our implementation takes away all of those concerns by looking at your OS trust store, which is constantly updated in the background and properly protected.
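Nothing changes in how you call the library; this tiny sketch (reusing pie.dev from earlier) just illustrates that verification works out of the box against whatever your OS already trusts:

import niquests

# Verification is on by default and relies on the operating system trust
# store rather than a bundled, rarely refreshed CA list.
r = niquests.get("https://pie.dev/get")
print(r.status_code)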
⚡ (V) Faster
You may get up to 3X faster in your usual synchronous context if you ever switch to multiplexing. Without multiplexing, performance is roughly aligned with competitors, except against Requests, where we are faster in all comparable scenarios.
HTTP/3 is a bit less performant as of today, but still much faster than any competitor in a synchronous context. We are actively working toward making it faster than HTTP/2.
Keep in mind that R̶e̶q̶u̶e̶s̶t̶s̶ Niquests has a lot of features and that we made every effort conceivable to keep them!
🧠 (VI) In-Memory Certificate
This one should come with enthusiasm to many advanced users! Python did not allow end-users to load client mTLS certificates and associated keys without having them in a file. Requests did not allow you to pass on your root CAs without a file either.
That's no longer the case; we elaborated a clever way to work around this limitation.
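As a sketch of what this enables, assuming the verify and cert parameters accept in-memory PEM payloads (check the documentation for the exact accepted forms); the PEM contents and the internal URL below are placeholders:

import niquests

# Placeholder PEM payloads: imagine they were pulled from a secret manager
# or an environment variable, never written to disk.
ca_bundle = "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----\n"
client_cert = "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----\n"
client_key = "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n"

with niquests.Session() as s:
    # Assumption: verify takes the CA bundle content directly, and cert takes
    # the certificate/key contents as a tuple, without any temporary file.
    r = s.get(
        "https://internal.example/api",
        verify=ca_bundle,
        cert=(client_cert, client_key),
    )
    print(r.status_code)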
🍰 (VII) Object-Oriented Headers
- We should control ourselves when eating sugar, not when pushing "syntactic sugar" as programmers.
Who did not, at the very least, waste a bit of time accessing HTTP headers from a response? Well!
import niquests

r = niquests.get("https://1.1.1.1")
Now, how should we access the max_age from the Nel, Report-To, or Strict-Transport-Security header, ASAP?
r.oheaders.nel.max_age
# and
r.oheaders.report_to.max_age
# also
r.oheaders.strict_transport_security.max_age
You are very much awake; yes, it is now that easy! (Pseudo) object-oriented headers are long overdue.
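For contrast, here is roughly what the same lookup used to require, parsing the raw header value by hand (one rough way among many, shown only to illustrate the pain):

import niquests

r = niquests.get("https://1.1.1.1")

# The raw value looks like: "max-age=31536000; includeSubDomains; preload"
raw = r.headers.get("Strict-Transport-Security", "")

max_age = None
for directive in raw.split(";"):
    key, _, value = directive.strip().partition("=")
    if key.lower() == "max-age" and value.isdigit():
        max_age = int(value)

print(max_age)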
😫 (VIII) Fixed Known (Painful) Issues
We've heard you, we know how painful some encounters are, especially with bugs! Here is an extract of what we've fixed for you! (in addition to the In-memory client certificate)
- An invalid content-type definition would cause the charset to be evaluated to True, thus making the program crash.
- Given proxies could be mutated when environment proxies were evaluated and injected. This package should not modify your inputs.
- Fixed Transfer-Encoding wrongfully added to headers when the body is actually of length 0.
- The proxy_bypass_registry function for Windows could be fooled due to insufficient control on our end.
- Unattended override of a manually provided Authorization header when a .netrc file existed with an eligible entry.
🚦 (IX) Type Annotated
- Know what you are doing, right from the start.
Powerful IDEs can easily leverage our type definitions to guide you through the code.
Just spotted something weird... and here's why!
📝 Yes, actually typeshed already has some definitions for Requests, but they are (most of them) incomplete.
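A tiny illustration (fetch is just a hypothetical helper): because everything is annotated, your IDE or mypy can check a call chain like this end to end.

import niquests

def fetch(url: str) -> niquests.Response:
    # The annotated API lets the type checker confirm that get() returns a
    # Response, and autocompletion works on everything it exposes.
    return niquests.get(url)

r = fetch("https://pie.dev/get")
print(r.status_code)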
☎️ (X) Plain Better Cu̶s̶t̶o̶m̶e̶r̶ "Developer"-Care
It's a cliché linked to small entities, but it can make all the difference. You won't feel wronged or thrown away, because I am deeply convinced that knowledge is best acquired through good confrontations.
- It's a promise: we'll never tell you to just go away to Stack Overflow. We will hold this position forever. For us, even the most insignificant Q&A is a real commitment to the project.
- We have a fast turnaround on responses to your concerns, proposals, or criticisms.
- Reports are always good. Never be scared of filing an issue, even if you are not sure of anything, let alone of submitting a PR.
➕ Bonus
- Main feature matrix
- Compatibility
Niquests is compatible with Python and PyPy 3.7+, and we remain committed to not dropping any interpreter version unless strictly required to!
All releases on the same major version are guaranteed to be backward compatible, unless we are truly forced otherwise.
Yes, async is supported! While it's not true async**, it serves the purpose of having non-blocking code in your event loop! You may combine async with the usage of a multiplexed connection.
- Drop-in replacement for Requests
You have nothing to do (most of the time) to switch to Niquests. A simple CTRL+R on import requests to import niquests as requests suffices! Or on from requests import to from niquests import. Any trouble migrating? Come and let us know what happened. We will find a way; trust me.
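In practice, the migration quoted above boils down to something like this (reusing the pie.dev endpoint from earlier):

# Before: import requests
import niquests as requests  # the rest of the code base stays untouched

with requests.Session() as s:
    r = s.get("https://pie.dev/get")
    print(r.status_code)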
- Much more you don't know about
Many topics (e.g., further improvements) haven't been shared in this article; come and discover them!
FYI, Niquests is more resilient to most bot detectors, provided you set the right headers (e.g., UA, Sec-CH-UA, etc.).
**: We ported the sync_to_async approach used in asgiref, as Niquests is mostly thread-safe. It is transparent for the end-user.
***: Not enabled by default.
📆 What's Next?
With your support, we can project ourselves much further than 2024!
- Advanced proxy combinations
Today, we support tunneling from HTTP/1.1, meaning we can do HTTP/1.1 -> HTTP/2 but not HTTP/2 -> HTTP/2, and even less HTTP/3 -> HTTP/2 or 3.
DNS over QUIC is soon to be a must-have.
- … when using QUIC.
- True Async, top to bottom
Will take some time. Not urgent.
- Advanced scheduling for multiplexed connections
Support for request priority weight.
We'll never settle. This is just the starting block.
—
Source:
PyPI:
Docs:
— Thank you for reading!
In the end, I got what I wanted: I migrated all my projects, enabled multiplexing where it was pertinent, and gained a substantial boost.