Author Topic: Do HTTP/HTTPS servers ever enforce multipart for file downloads?  (Read 952 times)


Offline peter-hTopic starter

  • Super Contributor
  • ***
  • Posts: 3694
  • Country: gb
  • Doing electronics since the 1960s...
I posted this elsewhere but few read that section.

So far I have managed to find out that this seems to work:

You just look for the first occurrence of \r\n\r\n (a 4-byte sequence); everything after it is the raw binary data.

> $ telnet xxxxx.com 80
> Trying y.y.y.y...
> Connected to xxxx.com.
> Escape character is '^]'.
> GET /how/img/glide.png HTTP/1.1
> Host: xxxx.com
>
> HTTP/1.1 200 OK
> Date: Sat, 26 Aug 2023 14:10:40 GMT
> Server: Apache/2.4.41 (Ubuntu)
> Last-Modified: Sun, 06 Apr 2014 08:20:34 GMT
> ETag: "hghghghghghg"
> Accept-Ranges: bytes
> Content-Length: 7901
> Content-Type: image/png
>
> <89>PNG^Z (...)
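The scan described above can be sketched in C like this (function name is mine; a minimal, non-streaming version that assumes the whole header fits in the buffer):

```c
#include <stddef.h>

/* Return the offset of the first body byte, i.e. the byte just after
 * the \r\n\r\n terminating the HTTP response headers, or -1 if the
 * sequence has not been seen yet.  The buffer may contain binary data
 * and need not be NUL-terminated, so scan bytes rather than strstr(). */
long http_body_offset(const unsigned char *buf, size_t len)
{
    for (size_t i = 0; i + 4 <= len; i++) {
        if (buf[i] == '\r' && buf[i+1] == '\n' &&
            buf[i+2] == '\r' && buf[i+3] == '\n')
            return (long)(i + 4);
    }
    return -1;   /* keep reading; headers not complete yet */
}
```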

That is easy. It would be great to avoid multipart delimiters entirely, because parsing them is messy on an embedded client and takes a bit of RAM.

Are there any standards for this stuff?
Z80 Z180 Z280 Z8 S8 8031 8051 H8/300 H8/500 80x86 90S1200 32F417
 

Offline ejeffrey

  • Super Contributor
  • ***
  • Posts: 3713
  • Country: us
Re: Do HTTP/HTTPS servers ever enforce multipart for file downloads?
« Reply #1 on: August 27, 2023, 10:01:51 pm »
Under normal circumstances an HTTP(S) server will not send a multipart response to a client. Multipart is normally only used for client-to-server form submissions with files via the POST method.
 

Offline HwAoRrDk

  • Super Contributor
  • ***
  • Posts: 1471
  • Country: gb
Re: Do HTTP/HTTPS servers ever enforce multipart for file downloads?
« Reply #2 on: August 27, 2023, 10:28:45 pm »
The only common, standard situation I'm aware of where a server will issue a multipart response to a client is when a request has explicitly been made for multiple byte ranges using a Range header.

Other than that, I suspect an HTTP server would only do so if a specific URL was configured that way for a proprietary purpose. You're highly unlikely to encounter that in the wild, though. In fact, I suspect such a server would only give you a multipart response if you sent an Accept header containing multipart/* (or something more specific).
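For reference, a multipart/byteranges reply to a two-range request looks roughly like this (boundary string and byte ranges are illustrative):

```
HTTP/1.1 206 Partial Content
Content-Type: multipart/byteranges; boundary=SEP

--SEP
Content-Type: image/png
Content-Range: bytes 0-99/7901

(first 100 bytes of the file)
--SEP
Content-Type: image/png
Content-Range: bytes 7801-7900/7901

(last 100 bytes of the file)
--SEP--
```

You only ever receive this after sending a Range header naming more than one range, e.g. `Range: bytes=0-99,7801-7900`.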
 

Offline golden_labels

  • Super Contributor
  • ***
  • Posts: 1208
  • Country: pl
Re: Do HTTP/HTTPS servers ever enforce multipart for file downloads?
« Reply #3 on: August 27, 2023, 10:39:20 pm »
A multipart response doesn’t require more memory to process than any other response, in practical terms. I’d say this renders the entire question dismissible.

A server may respond with multipart data. Realistically, this is only going to happen if you already expect a multipart response. HTTP services can be roughly divided into two groups: those meant for web browsers, and API endpoints. So we deal with the following scenarios:
  • A browser making a simple request for normal content:
    The browser expects a single object in response, and it makes no sense for a server to send multiple objects. Browsers couldn't even handle that.
  • A browser makes a request to a dynamically changing resource:
    Primarily webcams sending a low-framerate image stream over HTTP, realized with multipart/x-mixed-replace. You must explicitly request that kind of resource to receive a multipart response.
  • A client makes a request for multiple byte ranges:
    Aside from this feature not being widely adopted, the client must make an explicit request for multipart data, which implies it expects the multipart response and can handle it.
  • API endpoints:
    Rarely encountered, but possible. If you make a request to a particular API endpoint, you already know what kind of response you are going to receive. So a multipart response is possible only if you wanted one.
Making a plain request without expecting multipart data and receiving a multipart response would be more than weird.
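As for the memory point above: even if you did have to parse multipart, the delimiter can be detected with a rolling matcher whose only state is a single index, so RAM use is independent of payload size. A minimal sketch (names are mine; it relies on boundary strings never containing CR or LF, which RFC 2046 guarantees):

```c
#include <stddef.h>

/* Streaming delimiter matcher: feed bytes one at a time; returns 1
 * when the delimiter "\r\n--<boundary>" has just been completed.
 * State is a single index into the delimiter, so the RAM cost does
 * not grow with the payload.  Because a boundary cannot contain CR
 * or LF, a failed match can only restart at a '\r'. */
typedef struct {
    const char *delim;   /* e.g. "\r\n--SEP" */
    size_t      len;     /* strlen(delim) */
    size_t      pos;     /* current match position, start at 0 */
} bmatch_t;

int bmatch_feed(bmatch_t *m, char c)
{
    if (c == m->delim[m->pos]) {
        if (++m->pos == m->len) {
            m->pos = 0;
            return 1;                    /* full delimiter seen */
        }
    } else {
        m->pos = (c == m->delim[0]) ? 1 : 0;   /* restart on '\r' */
    }
    return 0;
}
```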
People imagine AI as T1000. What we got so far is glorified T9.
 

Offline peter-hTopic starter

  • Super Contributor
  • ***
  • Posts: 3694
  • Country: gb
  • Doing electronics since the 1960s...
Re: Do HTTP/HTTPS servers ever enforce multipart for file downloads?
« Reply #4 on: August 29, 2023, 06:22:34 am »
Thank you all. This has been extremely interesting and useful!
Z80 Z180 Z280 Z8 S8 8031 8051 H8/300 H8/500 80x86 90S1200 32F417
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6242
  • Country: fi
    • My home page and email address
Re: Do HTTP/HTTPS servers ever enforce multipart for file downloads?
« Reply #5 on: August 29, 2023, 07:08:15 am »
Quote
Are there any standards for this stuff?
Not really; only internet RFCs, which are the next best thing to an internationally accepted standard.  See httpwg.org specifications.

The one you need to follow currently is HTTP/1.1 as specified in RFC 9112, because that's what most browsers will use. The only transfer coding you are required to support is chunked.
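Chunked framing is simple: a hex chunk size, CRLF, that many data bytes, CRLF, repeated until a zero-size chunk. A minimal sketch of the decoding (function name is mine; it assumes the whole body is buffered and NUL-terminated, whereas a real embedded client would decode incrementally):

```c
#include <stdlib.h>
#include <string.h>

/* Decode a complete chunked-encoded body in src into dst.
 * Returns the decoded length, or -1 on a malformed stream.
 * Framing: <hex size>[;extensions]CRLF <data> CRLF ... "0" CRLF CRLF. */
long chunked_decode(const char *src, char *dst, size_t dstlen)
{
    size_t out = 0;
    for (;;) {
        char *end;
        unsigned long n = strtoul(src, &end, 16);  /* chunk size */
        if (end == src) return -1;                 /* no hex digits */
        src = strstr(end, "\r\n");                 /* skip extensions */
        if (!src) return -1;
        src += 2;
        if (n == 0) return (long)out;              /* last chunk */
        if (out + n > dstlen) return -1;           /* output too small */
        memcpy(dst + out, src, n);
        out += n;
        src += n;
        if (src[0] != '\r' || src[1] != '\n') return -1;
        src += 2;                                  /* CRLF after data */
    }
}
```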

While almost everything still supports HTTP/1.0 as specified in RFC 1945, for any product/application intended for interoperability with generic clients I do recommend moving to HTTP/1.1.



At least nowadays both browsers and servers send correctly formed MIME multipart queries (POST file uploads) and responses (various, as mentioned in previous messages). I did web stuff back when MS fucked with everybody and had severe bugs in Internet Exploder they refused to fix (no CRLF after the final multipart boundary, keep-alive bugs). Although the workarounds were not too onerous, finding out about them was, especially because a majority of users believed that Microsoft could not be wrong, therefore everyone else was just doing it wrong and needed to conform... I mean, even desktops tended to have only 16 to 64 MiB of RAM at that time, before the turn of the century. Things could be much worse, really.

(That's not to say things couldn't be much better, with an HTTP- and HTML/XML-compatible protocol you could parse with a simple state machine while still being human-readable and -writable. That is one of the topics I do still like to rant about.)
 

Offline peter-hTopic starter

  • Super Contributor
  • ***
  • Posts: 3694
  • Country: gb
  • Doing electronics since the 1960s...
Re: Do HTTP/HTTPS servers ever enforce multipart for file downloads?
« Reply #6 on: August 29, 2023, 08:09:59 am »
Quote
for any product/application intended for interoperability with generic clients

This is for downloading files from web servers. It is a client, HTTPS.

Specifically it is for downloading the latest cacert.pem file which comes from here: https://curl.se/docs/caextract.html
Z80 Z180 Z280 Z8 S8 8031 8051 H8/300 H8/500 80x86 90S1200 32F417
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6242
  • Country: fi
    • My home page and email address
Re: Do HTTP/HTTPS servers ever enforce multipart for file downloads?
« Reply #7 on: August 29, 2023, 08:42:59 am »
Quote
This is for downloading files from web servers. It is a client, HTTPS.

Specifically it is for downloading the latest cacert.pem file which comes from here: https://curl.se/docs/caextract.html
I'd use a web server under my own control. It would cache cacert.pem and serve it in an HTTP/1.0-compatible form, i.e.
Code:
HTTP/1.1 200 OK\r\n
Connection: close\r\n
Content-Length: 221470\r\n
Content-Type: application/x-pem-file\r\n
Date: Tue, 29 Aug 2023 08:30:45 GMT\r\n
Last-Modified: Tue, 22 Aug 2023 03:12:04 GMT\r\n
Cache-Control: max-age=1800\r\n
Expires: Tue, 22 Aug 2023 04:07:46 GMT\r\n
\r\n
... 221470 byte PEM file omitted for brevity...
That way, if the URL changes, only the web server configuration needs to be updated.  And you won't annoy curl.se either.

Many existing web hosts that cost only a couple of euros per month (I happen to use OVH, with a Let's Encrypt HTTPS certificate) can be used for this. Just configure the particular URI used in your clients, say https://www.yourserver.domain/ca-certificates.pem, to be provided by a CGI script (or better yet, served as an as-is file, i.e. one including its own headers), and have that CGI script emit the contents of the local file as above. Then, set up the local file to be updated (atomically, i.e. using the low-level link() or rename() method to replace the file when complete) at regular intervals.
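The atomic-replace step mentioned above can be as small as this (file names are illustrative; rename() within one filesystem is atomic on POSIX, so readers always see either the old file or the new one, never a partial write):

```c
#include <stdio.h>

/* Write the new PEM data to a temporary name in the same directory,
 * then rename() it over the live file.  On failure the temporary is
 * removed and the old file remains untouched. */
int replace_pem(const char *data, size_t len)
{
    FILE *f = fopen("cacert.pem.tmp", "wb");
    if (!f)
        return -1;
    if (fwrite(data, 1, len, f) != len || fclose(f) != 0) {
        remove("cacert.pem.tmp");
        return -1;
    }
    return rename("cacert.pem.tmp", "cacert.pem");  /* atomic swap */
}
```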

I'd also let my users reconfigure that URI to something else if they so want, as well as the interval at which to re-read the root certificate chain.
 

Offline HwAoRrDk

  • Super Contributor
  • ***
  • Posts: 1471
  • Country: gb
Re: Do HTTP/HTTPS servers ever enforce multipart for file downloads?
« Reply #8 on: August 29, 2023, 10:04:51 am »
Yes, that is a good idea, to cache/proxy the file at a domain and URL that you control.

Not only is it polite to take the load off the source (especially if there are going to be thousands of devices out there updating at frequent intervals), but you can mitigate any changes in location or availability of the source.

Although, on the other hand, I suppose for SSL root certs there is something to be said for "getting them straight from the source" (cURL project not being the source, of course, but a trusted source).
 

Offline peter-hTopic starter

  • Super Contributor
  • ***
  • Posts: 3694
  • Country: gb
  • Doing electronics since the 1960s...
Re: Do HTTP/HTTPS servers ever enforce multipart for file downloads?
« Reply #9 on: August 29, 2023, 10:43:46 am »
Quote
I'd use a web server under my own control.

There is a trust issue with that, from the customer's POV. If you can corrupt the cert store, all bets are off :) And the mfg (me) would have to maintain that server forever; a tenner a month in reality, but there isn't a charging mechanism in place.

Quote
especially if there are going to be thousands of devices out there updating at frequent intervals

It would be perhaps once a month.

The URL will be customer-configurable in a config file, accessible over USB. Which takes us back to "security = 0 without physical security" but this futureproofs that aspect of the product.

Quote
I suppose for SSL root certs there is something to be said for "getting them straight from the source"

Exactly. Otherwise the whole business of peer authentication using certificates is bogus. You just need to install a fake DNS server and it's all yours :)
Z80 Z180 Z280 Z8 S8 8031 8051 H8/300 H8/500 80x86 90S1200 32F417
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6242
  • Country: fi
    • My home page and email address
Re: Do HTTP/HTTPS servers ever enforce multipart for file downloads?
« Reply #10 on: August 29, 2023, 11:25:30 am »
Quote
I'd use a web server under my own control.

There is a trust issue with that, from the customer's POV. If you can corrupt the cert store, all bets are off :)
If they don't trust the manufacturer, why would they trust the device?  Besides, I already described that the URLs and what they provide must be easily verifiable by the end user anyway.

In practice, you are part of the trust chain.  Externalizing that node to e.g. curl.se only makes the trust chain weaker, not stronger, and may piss off curl.se maintainers.

For a wider view of what can happen, look up NTP. There are vendors who simply pick a subset of NTP servers (often Linux distributions' own NTP servers that have nothing to do with that vendor or the OS the vendor uses) and hardcode the server IP or name into their product, sometimes overloading the NTP servers so much that the distro or NTP server admin changes the name and/or IP address, breaking network time updates for that vendor, who then blames the NTP server for "breaking things". It's not a good thing, and you don't want to cause that kind of ruckus.

At minimum, you'd need to ask curl.se for permission.

Quote
I suppose for SSL root certs there is something to be said for "getting them straight from the source"

Exactly. Otherwise the whole business of peer authentication using certificates is bogus. You just need to install a fake DNS server and it's all yours :)
The upstream source for the Mozilla CA Certificate List is here at Mozilla Wiki, specifically the PEM of Root Certificates in Mozilla's Root Store with the Websites (TLS/SSL) Trust Bit Enabled (TXT) link.  The Wiki page includes a FAQ question on whether/how one can use the CA list.  You can use OpenSSL command-line tools on the server to parse the PEM data, to provide the same file or similar as the curl.se page does.

Quote
And the mfg (me) would have to maintain that server forever, a tenner a month in reality, but there isn't a charging mechanism in place.

The "tenner a month" (it's not; it's more like a pound or two per month, and you can pay several years up front) is and should be included in the cost of the device. I'd put any updates et cetera on the same server, possibly behind a customer-specific download key (similar to a username-password pair) required for each download, which also provides a unified log of downloads and attempts (use the database the web host provides for this).

It is the cost of doing Internet-capable development kits, really.
 

Offline peter-hTopic starter

  • Super Contributor
  • ***
  • Posts: 3694
  • Country: gb
  • Doing electronics since the 1960s...
Re: Do HTTP/HTTPS servers ever enforce multipart for file downloads?
« Reply #11 on: August 29, 2023, 11:36:20 am »
Quote
I'd put any updates et cetera on the same server, possibly behind a customer-specific download key (similar to username-password pair) required for each download, that also provides an unified log of downloads and attempts (use the database the web host provides for this).

That is in the future somewhere, and it isn't trivial. I think we had a thread on how to do OTA firmware updates. For example, one needs to exclude large-volume users (or limit them to older versions) because if one of them gets everything bricked, their lawyers can destroy your company.
« Last Edit: August 29, 2023, 12:08:00 pm by peter-h »
Z80 Z180 Z280 Z8 S8 8031 8051 H8/300 H8/500 80x86 90S1200 32F417
 

