But if you're already using npm in your project, as a significant number of front-end projects are, you can serve the styles directly from the package rather than requiring an external CDN.
Storing your food in the fridge requires an entire fridge (and power, and...)
Your particular circumstances might not require it (maybe you're just temporarily camping or you only store non-perishable food) but that doesn't mean that fridges, in general, are unnecessary or less convenient than just storing food in a cupboard. Even if you only eat in restaurants and you don't need a fridge, the restaurant does.
This being packaged won't prevent it from being delivered from a CDN. It will actually make it easier to automatically deploy all versions to CDNs as they are published, like in https://www.jsdelivr.com/, while being CDN-only is less convenient when you actually need the many affordances that a package manager provides.
If I already have a package manager and do:
    yarn add css-extras
And then in my code:
    import 'css-extras';
...and I get it versioned in my package.json, cached, available offline, not subject to link rot, automatically inserted in my bundle, processed, minified and with its dead code eliminated... that's surely more convenient than vendoring from a CDN and manually doing all that process (or worse, not doing it at all and just dropping a raw <link> to the CDN in my HTML, with all its drawbacks).
Welp, time to make a @function preprocessor. There is no reason for every single client to recalculate things which could have been completely or partially calculated at build time.
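A sketch of what that could buy, using the draft @function syntax (function name and values illustrative):

    /* Source: a custom function the browser would otherwise evaluate at runtime */
    @function --spacing(--n) {
      result: calc(var(--n) * 8px);
    }
    .card { padding: --spacing(2); }

    /* What a build step could emit instead, since the argument is fully static */
    .card { padding: 16px; }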
I used to share this sentiment (and I’m a web performance consultant by profession so very few people care about performance as much as me!), but when you consider how much calculation we _happily_ let our JS do at runtime, I don’t think forcing CSS to be static/preprocessed is worth it. And that’s not even me taking a swipe at overly-JSsed front-end; I’m talking about any runtime work that JS picks up.
Is preprocessed CSS faster? Yes. Is it meaningfully faster? Probably not.
An optimisation I've always wondered about for transforming/translating/animating elements: is it faster to apply transforms or the animation API directly to the element from JS (e.g. style.transform / element.animate), or to update CSS variables with JS and let the CSS engine reposition the inheriting elements?
In the context of animations, I'd intuit the latter but would be open to hearing why.
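For concreteness, a minimal sketch of the two approaches being weighed (selector and values made up):

    const offset = 120; // example value, e.g. from a scroll handler

    // Approach 1: write the transform onto each element from JS
    for (const el of document.querySelectorAll('.item')) {
      el.style.transform = `translateX(${offset}px)`;
    }

    // Approach 2: set one custom property and let the CSS engine move
    // every element whose styles reference it, e.g.
    //   .item { transform: translateX(var(--offset)); }
    document.documentElement.style.setProperty('--offset', `${offset}px`);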
But there are also plenty of use cases where recalculation will be valuable in the client. CSS variables cascade so a preprocessor isn't going to be able to know ahead of time what any given variable value is.
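A minimal illustration:

    :root { --accent: #06c; }
    .theme-dark { --accent: #6bf; }

    /* The computed value of var(--accent) depends on whether the button
       ends up inside a .theme-dark subtree at runtime, so no build step
       can safely inline it. */
    button { color: var(--accent); }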
Sure, the new syntax allows doing some nifty stuff with the cascade. In practice, however, I foresee most usage being simple one-time transformations of design tokens. I suppose it is more of a theme-architecture issue.
It’s the same with Custom Properties. There are plenty of situations where they are useful at runtime, but a lot of their use is just a single definition on :root, and people really would be better served by the likes of Sass variables, because they foil all kinds of optimisations. You end up with things like color-mix(in srgb, var(--some-unreasonably-long-name), transparent) where it could have just been #1234567f. Quite apart from the runtime and memory costs, bundle size (even gzipped size) can frequently be reduced drastically by flattening eligible variables.
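A sketch of that flattening, assuming the property is defined exactly once on :root as #123456 and never overridden:

    /* As authored */
    :root { --some-unreasonably-long-name: #123456; }
    a { color: color-mix(in srgb, var(--some-unreasonably-long-name), transparent); }

    /* After build-time flattening: a 50/50 mix with transparent is just
       the same color at ~50% alpha */
    a { color: #1234567f; }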
It's a tradeoff. I expect this to be non-trivial, to do nothing in the general case (when functions refer to runtime CSS variables), and possibly to increase your final CSS size for any sufficiently complex codebase once unrolled.
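E.g., a sketch of the unrolling (function and values illustrative):

    @function --shade(--c, --pct) {
      result: color-mix(in srgb, var(--c), black var(--pct));
    }

    /* Compact call sites in source... */
    .btn { background: --shade(#06c, 20%); }
    .btn:hover { background: --shade(#06c, 35%); }

    /* ...but unrolled at build time, each call expands in place */
    .btn { background: color-mix(in srgb, #06c, black 20%); }
    .btn:hover { background: color-mix(in srgb, #06c, black 35%); }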
This may feel true if you've re-engaged with CSS's progression in the last ~5–7 years. In reality, the last big qualitative leap was Grid in 2017.
This project is based on just one new proposed rule which won't be available in all mainstream browsers until 2027-28, and won't be safe for production use until close to the end of the decade.
Of note from 2023: subgrids, :has, container queries, nesting... And in 2022, cascade layers (plus <style scoped>, I mean @scope, I mean :scope).
https://caniuse.com/?search=%40function
It's a post about web development on HN. Half the comments will rag incessantly, half will talk about how the web should go back to being a delivery mechanism for documents only like it's 1995 forever, someone will rant about Google for some reason. It's a neverending nightmare.
I'm no expert in this domain, but I suspect it's less "this is a bad problem to solve" and more "every solution to a problem moves farther away from the ideal simplicity of a markup language."
(I'm not weighing in on the validity of this position, just reporting what I perceive the position itself to be.)
People who don’t understand the problems CSS has to solve are opposed to CSS solving those problems. Sure it can all be stuffed into Tailwind classes!!
It would be quite nice to see some more "killer" uses of this new feature that aren't just "we removed some duplication and... saved less than 1% of our loc".
And maybe there are some really compelling ones... I think the only really useful one I see here is `--abs`, which really should just be built-in.
https://developer.mozilla.org/en-US/docs/Web/CSS/abs
There are also other non-custom math functions like round() which can be very useful.
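For instance, a small sketch of those built-ins (property choices illustrative):

    .gauge {
      /* abs() yields the magnitude of a possibly-negative value */
      width: abs(var(--delta, -40px));
      /* round() snaps a computed length to the nearest 8px step */
      height: round(calc(10vh + 3px), 8px);
    }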
Whoo. I’ll be the first hyper negative prototypical HN commenter.
I’m glad I don’t work on browser engines for a living. CSS is getting more complex and spaghetti-capable by the day.
> Currently only supported in Chrome 141+. The @function rule is going through the W3C standardization process and will be available in other browsers soon.
Also, pretty tired of Chrome effectively front-running standards as a not-so-subtle means of cramming them through.
https://github.com/w3c/csswg-drafts/issues/9350
Web standards are in the same boat as C++. They can never really deprecate anything, but they want shiny new things, so they just add and add on top of the pile.
Every feature sounds great in isolation, but in aggregate they become a Moloch.
Then people say “modern CSS is great, you just have to pick the ‘good subset’,” but then nobody can agree on what that subset should be, and everybody ends up using a different one.
LLMs also contribute to this, as 90% of what’s available on the web is considered outdated now, but that is the majority of training data.
One person's front-running is another's reference implementation.
Although, yes, CSS is getting more complex because everything on the web is. What's the last standard feature to really be taken away after actually existing in the wild for a while? XHTML, and Flash (effectively a standard, even if not officially one)?
So I guess it really is true that nothing actually gets removed -- except the one that wasn't actually controlled by the WHATWG or W3C.
Is there still a real-world use case for XHTML/"XML syntax for HTML", or is this just exhibit A that no standard can actually be removed from browsers?
Re: XSLT, back in the everything-is-XML days I desperately wanted to like XSLT, it seemed so useful (I was that annoying co-worker telling everyone it's supposed to be pronounced "exalt"). But it was such a disaster to actually write or read and no real debugging was possible, I had to use a LOT of conditional bgcolor=red to figure anything out. It didn't take very long to come to the conclusion that XPath was the only useful part.
XSLT is a W3C standard:
https://www.w3.org/TR/xslt/
If I need the markup of a page to be free of structural errors, I often use XHTML, at least for testing: though it's a little more verbose, if there's a nesting error, for example, the browser will flat-out refuse to render the page and show an XML parse error instead. So it's quite a good built-in "tool" for checking that your markup is clean.
With HTML, anything goes and the browser will happily render broken markup, which is probably the correct default for the web as a whole. After all, you surely don't want a page like Wikipedia to show an error message to its users just because a developer forgot to close a tag somewhere.
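For anyone trying the XHTML approach above: the strict parsing only kicks in when the page is actually served as XML (Content-Type: application/xhtml+xml, or a local .xhtml file), not merely by adding an XML prolog. A minimal page a browser will refuse to render:

    <?xml version="1.0" encoding="UTF-8"?>
    <html xmlns="http://www.w3.org/1999/xhtml">
      <head><title>Strict test</title></head>
      <body>
        <p>Mis-nested <b>tags</p></b>
      </body>
    </html>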
Luckily, CSS module scripts are starting to land in multiple browsers. Firefox added support behind a flag, and it might ship in 145.
So you'll be able to import the CSS from your JS modules, and apply it to the document:
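A sketch of what that could look like with CSS module scripts (the file name is illustrative):

    // The `with { type: 'css' }` import attribute marks this as a CSS
    // module script; the module evaluates to a constructable CSSStyleSheet.
    import sheet from './css-extras.css' with { type: 'css' };

    // Adopt it document-wide.
    document.adoptedStyleSheets.push(sheet);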
Or, if you use shadow DOM, you can adopt the same sheet on the shadow root via shadowRoot.adoptedStyleSheets.

On the contrary. CSS functions and mixins may make a lot of current cruft unnecessary.
Is anything really necessary? Not snark: almost nothing is necessary in life but many things are convenient.
https://cdn.jsdelivr.net/npm/css-extras@0.3.1/index.css
CDNs also happen to be a great attack vector! You and your users are much better off not using them for anything but toys.