Recent comments in /f/technology
nitori OP wrote
Reply to You know you've gone deep into being a reactionary when you find yourself asking why they introduced keepalive to HTTP by nitori
I changed my mind a bit about keepalive. I think that's necessary for reverse proxies when connecting to their upstream lol
But I still question its use in a real server-client model
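For the reverse-proxy case it's also just a couple of lines of config. A minimal nginx-style sketch (my instance is behind nginx; the upstream name and port here are made up):

upstream postmill_backend {
    server 127.0.0.1:8080;
    keepalive 16;                        # keep up to 16 idle connections to the upstream open
}

server {
    location / {
        proxy_pass http://postmill_backend;
        proxy_http_version 1.1;          # keepalive to the upstream needs HTTP/1.1
        proxy_set_header Connection "";  # don't forward the client's Connection header
    }
}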
nitori OP wrote
Reply to comment by flabberghaster in You know you've gone deep into being a reactionary when you find yourself asking why they introduced keepalive to HTTP by nitori
I think TCP FO should be the way to go since it's more elegant imo than keeping a connection open, though unfortunately ossification means it will take a very long while to get all TCP-based services and clients to support it. There are also privacy issues with its cookies.
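If you want to poke at TFO it's at least easy to try on Linux (a sketch; the sysctl value 3 turns it on for both the client and server roles, and the server on the other end still has to support it):

$ sudo sysctl -w net.ipv4.tcp_fastopen=3
$ curl --tcp-fastopen -o /dev/null -s -w '%{time_connect}\n' https://example.com/

The first connection still does a full handshake just to get the TFO cookie; only repeat connections get to put data in the SYN, and that cookie is exactly where the privacy issue comes from, since it's effectively a per-client identifier.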
As for SSL, if we just had tcpcrypt or any other opportunistic encryption we wouldn't need Let's Encrypt or any free TLS lol (I feel like TLS has been abused too much, it should've been more about identity verification than encryption). I'm actually hopeful for Yggdrasil since it's an IPv6 mesh network where end-to-end encryption between IPs is the norm and each IP is derived from a node's public key
flabberghaster wrote
Reply to You know you've gone deep into being a reactionary when you find yourself asking why they introduced keepalive to HTTP by nitori
IDK, I think there is a use for keeping the same stream open if you're a big website serving a lot of clients tbh. Each TCP handshake takes three packets minimum (unless you use TCP fast open, which is its own whole thing), and then if you want SSL on top of that there's even more latency, especially on slow connections, plus the handshake computation, which is small per request but adds up when you're serving a lot of people. Even if you're not jamming your page full of ten trillion google ads it adds up.
Using the same connection again if you expect the client to make another one pretty soon makes a lot of sense.
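You can actually watch the reuse happen: give curl two URLs on the same host and the verbose output should mention re-using the existing connection for the second request instead of doing another handshake (exact wording depends on your curl version):

$ curl -sv -o /dev/null -o /dev/null https://example.com/ https://example.com/ 2>&1 | grep -iE 'connected to|re-us'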
I don't do web dev tho so what do I know.
hollyhoppet wrote
Reply to You know you've gone deep into being a reactionary when you find yourself asking why they introduced keepalive to HTTP by nitori
cucked transfer encoding
emma wrote
might wanna add one or two more just to be safe
nitori OP wrote (edited)
also why u no support HTTP/1.0 (which also means no HTTP/0.9) :(
When trying to use http/1.0 and http/0.9 ALPN:
$ openssl s_client -connect jstpst.net:443 -servername jstpst.net -alpn http/1.0
CONNECTED(00000003)
4027744A687F0000:error:0A000460:SSL routines:ssl3_read_bytes:reason(1120):../ssl/record/rec_layer_s3.c:1584:SSL alert number 120
---
no peer certificate available
---
No client certificate CA names sent
---
SSL handshake has read 7 bytes and written 327 bytes
Verification: OK
---
New, (NONE), Cipher is (NONE)
Secure Renegotiation IS NOT supported
Compression: NONE
Expansion: NONE
No ALPN negotiated
Early data was not sent
Verify return code: 0 (ok)
---
When I fake ALPN to http/1.1:
$ openssl s_client -connect jstpst.net:443 -servername jstpst.net -alpn http/1.1
CONNECTED(00000003)
depth=2 C = US, O = Internet Security Research Group, CN = ISRG Root X1
verify return:1
depth=1 C = US, O = Let's Encrypt, CN = E6
verify return:1
depth=0 CN = jstpst.net
verify return:1
---
[ssl certs and blah blah blah...]
---
read R BLOCK
GET / HTTP/1.0
HTTP/1.0 200 OK
Alt-Svc: h3=":443"; ma=2592000
Server: Caddy
Date: Tue, 23 Jul 2024 07:38:39 GMT
Content-Length: 0
closed
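(The alert number 120 in the first attempt is no_application_protocol, i.e. the server found no overlap with the offered ALPN list, so the handshake dies before any HTTP happens. Once the ALPN says http/1.1 it's perfectly happy to answer a plain HTTP/1.0 request.)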
nitori OP wrote
Reply to comment by twovests in Postmill is responding with a semantically wrong HTTP 3xx when submitting a post by nitori
Nope, just tested with my own Postmill instance and it's returning a 302 there too after a POST. That one's behind an nginx, tho I doubt nginx messes with status codes
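For reference, the response I'd expect from a POST-redirect-GET flow is a 303, which explicitly tells the client to switch to GET for the Location (the path here is made up):

HTTP/1.1 303 See Other
Location: /f/meta/1234/example-post

whereas what actually comes back is a plain

HTTP/1.1 302 Found
Location: /f/meta/1234/example-post

and 302 after a POST historically got handled inconsistently, which is exactly why 303 (and 307) exist; browsers nowadays just treat it like a 303 anyway.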
nitori OP wrote
Reply to comment by hollyhoppet in Postmill is responding with a semantically wrong HTTP 3xx when submitting a post by nitori
200 OK
hollyhoppet wrote
whatever you do, do not tell the cops
twovests wrote
this might be something with my caddy config and not postmill-- not 100% sure
nitori OP wrote
Though on the other hand maybe scratch that idea because I think this is gonna break plenty of old clients lol: https://http.dev/303
twovests wrote
(CW: More explicit references to domestic abuse)
I think security folks tend to think of security against a genius hacker with endless resources, which is a good mindset to have when you're building software and cryptography. But this mindset also makes a lot of security folks obstinately oblivious to reality.
I can't imagine what level of collective delusion the people at Microsoft must be under that they would advertise Windows Recall as a good feature. They must be aware of the blood that will be on their hands, right?
It feels almost like that's the point? "Windows with Copilot+ will help you keep tabs on you and yours, every step of the way."
hollyhoppet OP wrote
Reply to comment by ellynu in good news i don't have to do anything related to LLMs at all by hollyhoppet
oh yeah good point lol
ellynu wrote
Reply to comment by hollyhoppet in good news i don't have to do anything related to LLMs at all by hollyhoppet
but u make the people around u happy
hollyhoppet OP wrote
Reply to comment by emma in good news i don't have to do anything related to LLMs at all by hollyhoppet
that rules
emma wrote
my boss asked me if i wanted to be an "AI expert" at the beginning of the year, and i just looked at him funny, and now i don't get asked that anymore
hollyhoppet OP wrote
Reply to comment by ellynu in good news i don't have to do anything related to LLMs at all by hollyhoppet
actually it ended up that i barely have to do any work on this thing at all
ellynu wrote
instead you have to do stuff with HHM (holly hoppet makestuff)
twovests wrote
reassuring!
hollyhoppet OP wrote (edited)
Reply to comment by twovests in i may be asked to do an integration with an LLM in a couple hours by hollyhoppet
the company is extremely bullish right now on automation as a cost-saving measure, so unless it's something directly unethical i don't think i have much room to call it out. also yeah we're not hosting our own models, it would be through chatgpt lol
best case i can say "i don't know if this will work very well" and do whatever they ask. best best case is i'm only tangentially doing something to enable another team's integration.
twovests wrote
Oh man :\
I'm assuming this isn't a niche case where integrating an LLM makes sense right?
Perhaps you could raise high standards for the business justification and value of adding an LLM. Note the reputational risk of appearing to chase gimmicks at the expense of user experience. Maybe your app demographic is one which would be alienated by adding LLM garbage?
The company I work for has a natural-language processing powered tool and we've still not integrated new LLMs into it AFAIK. (The only information I have about this is what's public knowledge, to note)
Either way, good luck!! If you have to do the LLM integration I hope you can at least host your own models, and at least make it known how poorly interpretable and unpredictable LLMs are.
ellynu wrote
Reply to my review of every computer in existence by hollyhoppet
computers are real and strong and my friend
twovests wrote
Reply to my review of every computer in existence by hollyhoppet
what about if a computer were shaped like hello kitty
voxpoplar wrote
Reply to my review of every computer in existence by hollyhoppet
damb
neku wrote
Reply to You know you've gone deep into being a reactionary when you find yourself asking why they introduced keepalive to HTTP by nitori
don't know what this one means. hope everybody is having a great time in this thread