I always wonder why people compare LaTeX with Word and not with the single most popular document markup (especially here): HTML + CSS + JavaScript.
The problems are quite similar, "How do I center a div?" vs "How do I keep this float on this page?" Has latex really modernized? I don't hear a lot about new layouts or style mechanisms.
Most people are probably reading articles online these days, although there is a lot to be said about printing an article to read. It seems to me that adding responsiveness to journal articles instead of using a fixed paper layout regardless of media might be a good improvement for many readers in many situations.
There are many reasons this comparison is not made. I will just touch on one. The target medium is different. For html, you have monitors of different sizes as well as windows that can be resized. For latex, you choose your target at the start: A4 paper? Screen presentation? A0 poster?
With a fixed medium in mind, you can be extremely particular on where on this canvas you want a piece of text/graphic or whatever.
Without a fixed medium, you have to have logic to address the different mediums and compromises have to be made.
That seems contradictory, when LaTeX is rather famously imprecise at placing figures and such. Weren't both languages (at least at some point) intended to take layout control away from the writer?
But regardless, I think that, in addition to moving away from LaTeX, we should also reconsider the primary output format. Documents are rarely printed anymore, and inaccessible, fixed-size A4 PDFs are annoying to read on anything but an iPad Pro.
LaTeX isn't intended to take layout control away from the author so much as it is intended to automatically produce a good-enough layout allowing a single author to produce a very large document without employing a designer.
HTML by contrast explicitly does remove control over layout from the author and place it in the hands of the user (and their chosen user agent).
Both languages have mechanisms to (somewhat) separate the content from the formatting rules.
HTML+CSS has facilities to target a page format (the CSS @page rule, cm and in dimension units). Not to say that it's on the same level as LaTeX, but it's pretty impressive in its own right.
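To make that concrete, here is a minimal print-stylesheet sketch (illustrative only; the margin-box features are honoured far more reliably by dedicated print tools than by browsers):

```
/* Target A4 paper with mirrored margins and a running page number.
   The @bottom-center margin box comes from the CSS Paged Media spec;
   print-oriented tools (Prince, WeasyPrint, Paged.js) support it,
   browsers mostly don't. */
@page {
  size: A4;
  margin: 25mm 20mm;
  @bottom-center { content: counter(page); }
}
@page :left  { margin-right: 30mm; }
@page :right { margin-left: 30mm; }
h1 { break-before: page; }  /* each chapter starts on a fresh page */
```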
Are there good deep dives on how far you can practically push this? Especially in combination with headless browser pdf generation?
Last time I looked into it, a while ago, my impression was that it would get rickety too soon. It’d be a good place to be, I think, if web and “document” tech stacks could have nice and practical convergence.
We use CSS paged media to create e-books and invoices (using weasyprint [0]). One of the most helpful resources for me was print-css.rocks [1], they cover a lot of what’s possible and include which tools support which parts of it (tools targeting paged media, browser support is essentially non-existent and outside using JS to fake it with paged.js, not relevant). The expensive tools tend to support more features, but thanks to some donations/sponsorships, weasyprint has really caught up and now supports a very large part of the spec.
> Especially in combination with headless browser pdf generation
I have no idea why you’d want to do that. Browsers are bad at it, dedicated tools are great at it.
I'd say it's already there. See for example the https://pagedjs.org/ project which allows advanced typesetting (including for printing) using web technologies. It is already used in production by at least one book publisher (C&F editions)
I've used it for my own such production, perfect binding with a hand guillotine and screw clamps in my attic - nothing remotely professional, but you still have to start by making a book block, and Paged.js is a solid call there. Unless beauty of typography (more than TTF/OTF hinting can handle) is of particular merit, it's usually my preferred first typesetting option.
As an old hand with PDF-in-browser production, I expected much worse of Paged.js than I found. It's powerful and mostly enjoyable to use! Oh, you end up with a large set of CSS rules, and it is not without bugs and gotchas (failing to specify a bleed rule somewhere at least once in every @page context subtly breaks layout; footnote layout is functional but automatic call numbering isn't always perfect, etc.)
You should definitely not expect to take Paged.js out of the box, slap a theme on it, and go; it comes as a box of parts with a mostly complete machine inside, and if it breaks you get to keep all the pieces. I imagine the publisher who uses it must have some prior interest in web technologies, for example.
Nor is Paged.js remotely as capable or flexible as InDesign or a comparable tool, especially for the deeply rudimentary condition of web typography overall - something even as elaborate a tool as this can't really approach fixing.
But Paged.js is also unlike InDesign in having a much shallower (days vs months) learning curve for folks like us with prior web experience, and however equivocal a review I may now be giving of its technical merits, I do actually like working with Paged.js quite a lot.
I've also used pagedjs for a relatively complex booklet with bidirectional text in different languages, images, and long footnotes. The result was great, but there were some annoying bugs, some of which seemed to be underlying bugs in Chrome and Firefox. Still, latex would have been even more frustrating.
Coincidentally, I've also used pagedjs for a project recently (125K novel) and encountered some bugs/minor issues. Overall though, I would say I had an immensely positive experience (because even when stuff broke, it was still just HTML, CSS, and JS--so I, like any other web developer, could fix it).
That said, it's a shame that the relevant W3C specs (see https://pagedjs.org/about/) still aren't fully supported by browsers (but perhaps such is the fate of niche features), but with that being the case, I'm infinitely thankful that pagedjs exists as a polyfill.
Oh, I certainly don't doubt that. And as I said, I haven't really found Paged.js all that frustrating! I have extensive though not recent Pagemaker experience; I expected InDesign to be easier, and now I rue the day when that's where I'm forced to resort.
In my experience Paged.js is at its best when building to PDF, but then that's always my intermediate format when working to paper, because that's where PDF's inflexibility shines. The source of a book block, everything that builds to that PDF, partakes of all the infelicities of the JS ecosystem. But to remake the book itself again, all I need do to start is print the PDF.
I pay for a tool to convert HTML/CSS into PDFs https://www.princexml.com/ and it seems to work well. I don't have the best idea of how it compares to the various free options though.
Yes, sometimes, but I would say that one of the benefits of latex is how easy you can switch to another layout. But I guess the point is that you typically render to a set of outputs with fixed dimensions (pdf)
It's the same reason that Markdown became popular. I want my document to primarily contain content, not a sea of handwritten tags.
I don't want to manually type (or read past) HTML tags littered around the place. I don't want to manually put <p> tags on my text, or worry about how indentation will affect my rendered output. (For example, <p>foo</p> and <p> foo </p> render differently).
If I'm writing a blog post, I also don't want my post's text to get mixed up with site specific stuff, like meta tags and layout elements.
Are there any good "literate HTML" type tools which first and foremost let me type text, but still let me break into HTML? That I could get behind.
SGML (ISO 8879) has basically all of these things: it infers tags (such as opening paragraph tags, as in your example, but also the missing html, head, and body tags, and end tags for paragraphs, etc.), it has a built-in mechanism for recognizing custom tokens and turning them into tags (which can implement markdown and other custom syntaxes), it provides text macros, and many, many more things (including stylesheets and transformations for things such as table-of-contents generation and search result views).
In other words, SGML is complementing the HTML vocabulary with authoring affordances, as originally intended (HTML is based on it).
> I always wonder why people compare LaTeX with Word and not with... HTML...
At the very least, because those are the two popular software systems used for creating documents. HTML+CSS isn't; and Javascript is irrelevant for print.
A big con is that there are no Typst templates for the journals and conferences that academics submit papers to. For me, this is a show-stopper. I would love to be able to ditch latex because, honestly, it's old and it shows, in spite of apologists saying that it's perfect. But 90+% of my usage starts from a conference or journal template, so at the moment it's not gonna happen.
I don't feel like the template itself is the issue.
In Typst it's quite easy to recreate those templates without being years into Typst (in my experience).
The real problem is acceptance of non-word/latex papers
> The real problem is acceptance of non-word/latex papers
Some scientific journals, which only provide a Word template, require you to print to PDF to submit, then ship this PDF to India, where a team recreates the look of the submission in LaTeX, which is then used to compose the actual journal. I wish this were hyperbole. For these journals, you can safely create a LaTeX template that looks _almost_ the same, and get away with it.
The problem is the user base and acceptance of latex vs Typst. I use latex and, as aware as I am of its deficiencies, I can create a doc faster in it than in any other tool that I have never used before. I also have a bunch of utilities I created for my specific use-cases, automating data into tables, figures, etc., ready for latex import.
So it's a mass-and-momentum problem. Typst not only has to be better/easier/faster than latex, but to a degree that justifies all of the labor and time to learn it and to change all that existing template and utility infrastructure built up over decades. A high bar.
If Typst (or some other new contender) could also read and compile latex code and packages alongside its own syntax then that would be a game-changer. Then I can use all my old stuff and gradually change things over to typst (or whatever).
Typst is a breath of fresh air. Interacting with modern tooling (GitHub, discord). Responsive developers. Easy to read code. Easy to do things on your own.
Admittedly, my use case is mainly writing books, I've never published an academic paper.
The other option is people who never got into LaTeX get into Typst (usually by being too young to have gotten into LaTeX in college), and Typst takes over slowly that way.
But I thought one of the points of latex was to emit pdf files? Are you saying these places are so backwards that they only accept latex and word files? What stops them from being edited by someone?
Scientific journals do edit the TeX file. Both to update the visual style (e.g. enabling commercial fonts that they use for print but are not allowed to distribute with the template), and to update the content itself (to revise the grammatical style to fit the style guide followed by that journal, to update scientific references to have clickable links, etc.). Usually, at the end of all these edits, the journal sends a PDF “proof” back to the authors to verify that the final version is OK, or ask for corrections if they broke something (which they often do).
Have you checked out Quarto? There are a lot of templates supported already, and possible to create out of latex if not (or just generate latex from Quarto).
Is there any side-by-side comparison of a page created by LaTeX and by Typst?
My main selling point is that with LaTeX it is easy to create typography that shines with beauty from a distance. (Often way better than most of the books you find in stores.) With other typesetting systems, that is usually not the case. Yet, I am waiting for new things that offer simplicity while keeping the same (or better!) visuals as LaTeX.
As far as I know, the main differences (in the body text) between LaTeX and, say, Word, are the linebreaking algorithm (Knuth-Plass, which is used for both ragged-right and justified text) and the microtypography package. Is there anything else that contributes to the quality of LaTeX's output for ordinary English text?
Typst apparently uses Knuth-Plass, but I don't see any information about microtypography.
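For reference, enabling those microtypographic refinements in LaTeX usually amounts to a single line (a typical setup; pdfLaTeX and LuaLaTeX support both options, while XeLaTeX only does protrusion as far as I know):

```
% protrusion lets hanging punctuation stick slightly into the margin;
% expansion stretches/shrinks glyphs very slightly to improve line breaks
\usepackage[protrusion=true, expansion=true]{microtype}
```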
From what I see, it is also section breaking, fonts, and general typesetting defaults, such as margins, section headings, etc. (sure, they vary from package to package, and some are ugly, but the defaults are aesthetically pleasing).
I don't think this was ever my issue with LaTeX; my issues are mostly:
- the cryptic error messages and infinite logs
- the unintuitive ways to do stuff like store a value for later use or sum two lengths
- the very long compile times
- the amount of reliance on global state from various packages, which contributes to even more cryptic errors or weird behavior when something goes wrong
- various other quirks, e.g. the fact that you often need to end a line with a comment or the newline will screw up your content.
Some of that time is spent on optimizing the results at the paragraph, page, and multi-page level: river elimination, color balance, widow and orphan elimination, etc. I don't know how much of this Typst does; certainly HTML + CSS does none of it.
The non-control characters of ASCII are largely characters you might actually want to put in a document. TeX uses some of these as markup, e.g., the dollar sign to bracket maths mode and the ampersand as a column separator in tables. Typst takes this much further, using plus and minus signs to introduce list items, at signs for references, and so on.
Ideally, all visible characters should produce themselves in the printed output, except for the backslash introducing control sequences that represent all of the markup and braces for delimiting the extent of parameters to those control sequences. This would produce a very predictable and easily-parsed syntax.
Typst is more minimal and faster at compiling documents; I prefer using it.
But it's not in all cases a LaTeX replacement. LaTeX's ecosystem is also larger.
I have LaTeX documents I struggle to convert.
Yeah - typst has a bunch of features that I really want for blog posts and rich documentation, where markdown isn't a powerful enough tool. For example:
- Boxes & named figures
- Footnotes
- Variables, functions (incl populated from nearby files)
- Comments
- Chapter / Section headings (& auto generated table of contents)
- Custom formatting rules (For example, typst lets you define your own "warning box". Stuff like that.)
I don't know of a better tool to write my blog posts today. Markdown doesn't have enough features. And I'm obviously not writing blog posts in latex or a rich text editor. I could use actual javascript / JSX or something - but those tools aren't designed well for long form text content. (I don't want to manually add <p> tags around my paragraphs like a savage.)
Pity the html output is still a work in progress. I'm eagerly awaiting it being ready for use!
[^0]: it doesn't matter where this is placed, just that this one has a colon.
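For anyone wondering what those custom formatting rules look like in practice, here's a rough Typst sketch (the helper name and colors are made up):

```
// a hypothetical "warning box" helper
#let warning(body) = block(
  fill: rgb("#fff3cd"),
  stroke: 1pt + rgb("#d9a300"),
  inset: 8pt,
  radius: 4pt,
)[*Warning:* #body]

#warning[This part of the API may change.]
```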
The table of contents thing is annoying but it's not hard to write a little bash script. Sed and regex are all you need.
> Markdown doesn't have enough features
Markdown has too many features
The issue is you're using the wrong tool. Markdown is not intended for making fancy documents or blogs; it's meant to be a deadass simple format that can be read in anything. Hell, its goal is to be readable in a text editor, so it's more about styling. If you really want to use it and have occasional fanciness, you can use HTML.
But don't turn a tool that is explicitly meant to be simple into something complicated just because it doesn't have enough features. The lack of features is the point.
Yes, I think we're in violent agreement that markdown is the wrong tool for the job. That's why I find it baffling how so many blogging & documentation tools lock you in to using markdown, with its anaemic feature set (eg mdbook).
Even markdown + inline HTML is wildly inadequate. For example, you can't make automatically numbered sections. Or figures with links in the text. Or a ToC. And so on. Try and attach a caption to an image and you're basically hand authoring your document in crappy HTML.
So I agree with you. I don't think the answer is "markdown++" with comments, templating and scripting support. I think the answer is something else. Something which has considered the needs of authoring documents from the start. Something like typst.
> That's why I find it baffling how so many blogging & documentation tools lock you in to using
I feel this about so many things and it boggles my mind why people often choose to do things the hardest way possible.
Honestly, I think a good portion of it is the unwillingness to toss something aside and write something new. If it's just a hack on a hack on a hack on a hack, then no wonder it's shit. It's funny that it's often quicker to rewrite than to force your way through.
I'm worried that with LLMs and vibe coding on the rise we're just going to get more. Because people will be asking "how do I make X do Y" when in reality you shouldn't ever make X do Y, you need to find a different tool.
> I'm worried that with LLMs and vibe coding on the rise we're just going to get more.
I'm hoping the opposite, at least eventually. I think before long it'll be easy to get chatgpt to build your own version of whatever you want, from scratch.
Eg, "Hey, I want something kinda like markdown but with these other features. Write me the spec. Implement a renderer for documents in Go - and write a vs code extension + language server for it."
But if that happens, we'll get way more fragmentation of the computing ecosystem. Maybe to the point that you really need the memory of a LLM to even know what's out there - let alone understand how to glue everything together.
You missed my concern. Even if LLMs get much better, it doesn't mean the users will ask the right questions. Even now many don't ask the right questions, so why would it be any better when we just scale the issue?
MDX advertises itself as "markdown + components", but it's not CommonMark compatible. I tried using it a few years ago. In the process, I migrated over some regular markdown documents and they rendered incorrectly using MDX.
I filed a bug (this was a few years ago) and I was told commonmark compatibility was an explicit non goal for the project. Meh.
I do remember that too. In fact it was one of my physics teachers who got me into LaTeX - he used to complain about Word while praising LaTeX and its WYSIWYM.
Though I ended up becoming a graphic designer, so LaTeX felt rather limiting very quickly; fortunately I found ConTeXt.
Hoped Typst was going to be great for my use case but alas it's got the same "problem" as LaTeX - modularity. Still it seems to be a great alternative for people doing standard documents.
Twenty years ago you say. So that's when it had already been in existence for 20+ years and had been ubiquitous in academia (at least in the sciences) for 10 or more.
How so? Only their web app seems to be closed source. And the company was created by the two project founders. They also don't seem to be doing a lot more than a community project.
Obviously there are differences, but that wasn't the point of my comment. I replied to the claim that latex never needed "marketers". Or did you mean to reply to a different comment?
I meant that if there is no company financially benefiting from that activity, it is hard to call it marketing. But if there is a company, especially one backed by VC, that is a completely different story.
There is no VC with typst, they're bootstrapped. And I think by "marketeers" the original commenter did not mean actual marketing people, but enthusiastic fans. Unless it was a hidden accusation of astroturfing that I didn't get.
Word 20 years ago was a very different beast compared to word today. For starters, it still had a closed, binary (read: not friendly to source control) format. It also had more bugs than Klendathu.
When you are losing your semester's 25-page seminal work an hour before deadline because Word had that weird little bug about long documents and random CJK characters (and whether or not the moon was currently in the House of Aquarius supposedly), you develop a ... healthy dislike for it.
LaTeX back in the day didn't need zealots - Word did all the heavy lifting in demolishing itself for anything more involved than 'Secretary writes a letter', 'grandma Jones writes down her secret butterball recipe' or 'suits need a text, and only text, on paper, quickly".
(Yes, that was snarky. I am still bitter about that document being eaten.)
Currently you will find that LaTeX is the de facto standard at CERN. Maybe only management would not use it. But CERN gives overleaf professional licence to each member. And all templates I have seen for everything I interacted with that is going into publications are LaTeX.
Well, naturally 20 something years make a difference, although for some others, it looks pretty much the same, as I have visited a few times since then as Alumni.
> For starters, it still had a closed, binary (read: not friendly to source control) format
Word still has a closed format. It supposedly standardized OOXML, but - it doesn't follow that standard; Microsoft apparently managed to warp the XML standard to accommodate its weirdness; and all sorts of details encoded by MSO in that format are not actually documented.
There also used to be the problem of different renderings on different machines (even if you had all the relevant fonts installed): You opened a document on another person's computer and things were out-of-place, styling and spacing a bit different, page transitions not at same point etc. I don't know if that's the case today.
Granted, though, hangs and crashes and weird gibberish on opening a document are rare today.
> You opened a document on another person's computer and things were out-of-place, styling and spacing a bit different, page transitions not at same point etc.
When this happened to me on my job in the late 90s we were able to locate that problem in the printer driver that was visible in the Word print dialog. I don't remember the details but it looked like Word was adjusting font metrics to the metrics of the specific printer, and all the shifted pixels quickly added up to destroy the finely balanced lines of our print publication (yes, an official public health periodical by a European government was typeset with MS Word, and there was a lot of manual typographical work in each print). Given the technology at the time, it's not clear to me whether Word's behavior was a feature (in the sense of: automatically adjusts to your output device for best results) or a bug (automatically destroys your work without asking or telling you when not in its accustomed environment).
When you are the only option marketing doesn't matter.
I would suspect (based on my own experience) that the reason folks shout "typst!" anytime they hear latex is that the user experience is 1000x better than latex.
Many people say that they use LaTeX because it produces more beautiful output. Microtypography is one of the reasons for that. It's especially noticeable when microtype pushes hyphens or quotes at the end of a line slightly into the margin. (A nearby comment mentions that Typst has this feature, too.)
Nope, people don't need to know that something is done to appreciate the outcome. You might not know that modern MacBooks use ARM processors, but you might still appreciate that they have a long battery life.
Computer Modern is the very last thing I will ever want in a document and is the first thing I change in every LaTeX document I create. It is easily one of the ugliest fonts ever created.
It has a lot of good things going for it, but it is the least attractive font that I think I have ever seen.
I think it's quite attractive, but its attractiveness isn't really why it's desirable; it's because people know it is the font used by proper fancy scientific papers. It's like the opposite of Comic Sans.
How is that PDF made interactive? It has options to toggle the behaviour, which work even in an in-browser PDF viewer. I did not think PDFs could do that.
PDFs can do a lot more than show static content. There was a time when Adobe strongly advocated for PDF to be the page format of what would come to be called "The World Wide Web". Where we have HTML now, Adobe wanted PDF. Thankfully that did not happen. But I suspect it would have made more sense technically than [whatever this mess is that we have now.]
Are people looking seriously at the shortcomings of latex and moving towards modern replacements?
Major problems include:
- Tables are a huge pain.
- Customized formatting like chapter headings, footers, etc is painful.
- Latex as a language somehow felt like it had issues with composability of functions. The details elude me now, but it was something like: if you have a function to make text bold, and another function to make it italic, then applying one to the output of the other should give you bold italic, but such composability was not happening for some functions.
- Mixing of physical and logical formatting.
- A lot of fine-tuning required to get passable final output.
I recall a recent criticism of Typst being that it doesn't strip unused glyphs from fonts when making PDFs so they end up excessively large compared to other solutions. Has there been any change to that?
Does it have better/easier tables? Does it support complex tables, e.g. with images in them, with alternating horizontal or vertical text in cells, tables inside tables, tables with alternating row/column shading, etc., while still supporting automatic wrapping to contents?
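For what it's worth, Typst table cells take arbitrary content, so images, rotated text, and nested tables all work; a rough sketch (file names are hypothetical, and the striping-callback details may differ between Typst versions):

```
#table(
  columns: (auto, auto, auto),
  // alternating row shading via a fill callback
  fill: (_, row) => if calc.odd(row) { luma(240) },
  [*Run*], [*Plot*], [*Notes*],
  [1], image("plot1.png", width: 3cm), [baseline],
  [2], image("plot2.png", width: 3cm), rotate(-90deg)[vertical label],
  [3], table(columns: 2, [a], [b]), [a nested table],
)
```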
I live in fear that one of the major typesetting services like Overleaf will convince people to move away from a very durable standard and adopt something that’s much more change-oriented. Then we’ll all have to learn not one, but two standards. Rinse repeat.
Well, everyone likes free software (as in freedom and beer), but none of you pay, while on a six-figure salary. Meanwhile there is no hesitation to pay AWS, Netflix, Amazon, etc., all of them net-negative contributors to free software.
They are a very small team and this is a known issue - there is a website refresh coming up that will fix it
They developed the main face of the product first - the online webapp which has live collaboration - which sounds like a sane choice for a new company.
Yeah, today's open source combines the worst from corporate jobs and social media. Typst looks nice though, but is indeed developed in a logic of a business
Almost all of typst, except their web app, is available on crates.io and from many Linux distribution repositories. And you can skip the web app if you don't prefer it. There's no loss of functionality.
I find Typst today much easier to contribute to (in the open source sense) than latex. Go to the GitHub repo and interact with the developers, who happen to be very responsive.
I used latex for 20+ years and don't know how to file a bug for latex. Do I do it for xelatex, latex? Where? How do I update things? Download 4 gigs? Where's the documentation? Where's a book that explains how to contribute to latex? These are some of the issues I've dealt with and am happy to never have to deal with again.
PDF is used for pre-formatted content with reproducible layout. HTML is used for dynamically formatted, dynamically laid out and often reflowable content. It's debatable whether PDF needs a more modern alternative, but HTML is certainly not a replacement for it. There are several use cases where HTML is not the appropriate choice - especially for carefully laid out documents or books. You can simulate pre-formatted layout in HTML, but it always feels like shoehorning an unintended functionality.
LaTeX and Typst are markup primarily for PDF type contents. Something like Asciidoc or even Markdown is more appropriate for HTML type content. You can always switch the purposes, but I never got a satisfying output by doing that.
HTML with CSS paged media gets you reproducible layout without having to mess with LaTeX, and keeps you in open toolchains that aren't two decades or more old without any significant improvement or advancement.
I'm finally updating my CV after years of neglect. I'm keen on switching to the route of Org mode -> LaTeX -> PDF.
It's partly because I love the simplicity/power of Org and I do all my writing in it nowadays, the other part is to separate the content from the presentation so I can have the content in two different languages but still end up with the same formatted document for both.
Anyone have experience with this or have favorite LaTeX templates for CVs?
Nice CV. Way too much text but I absolutely love the included diagrams. I think interviewers are probably going to glaze over the text but the diagrams are interesting and they practically beg for questions.
I tried to avoid custom commands and environments to keep it simple. Your content in org text should fit nicely with this.
It also has a template where the preamble is stored in a different file, so that you can try a different look just by un/commenting a different preamble file.
My own experiment involved writing my CV in YAML, and using a Pandoc template to generate .tex and .pdf. I think I may have overcooked the thing a little, but it was good fun.
Org mode is the swiss army knife of content markup languages. It does a lot more than just content markup. But keep in mind that org-mode, markdown, asciidoc, etc don't afford much control on final layout. They're like plain HTML in function. LaTeX and Typst include more layout control - sort of like HTML with a little bit of CSS. This may not matter if you're preparing something like an article or document. But you may want more layout control for something like a CV.
LaTeX is quite underrated these days. Even though alternatives like Typst are popping up, LaTeX is also pretty convenient and powerful if you get past the crude syntax and obscure compilation errors.
I still remember my disbelief when I found out that I could change my article into a presentation just by changing the document class to "beamer".
These days I usually default to pandoc's markdown, mostly because the raw text is very readable.
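For anyone who hasn't seen it: the switch really is mostly the document class, though beamer then wants the body wrapped in frames. A minimal sketch:

```
\documentclass{beamer}  % was: \documentclass{article}
\begin{document}
\begin{frame}{Same section title, now a slide}
  \begin{itemize}
    \item The running text becomes bullet points on a slide.
  \end{itemize}
\end{frame}
\end{document}
```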
Please nobody actually do this. Good presentation slides have almost zero overlap with the corresponding article since they serve completely different purposes. In my field, seeing beamer slides is a huge red flag for an imminent terrible presentation. Slides are an extremely visual medium, and WYSIWYM is a huge hindrance for designing appealing slides.
I disagree. LaTeX is very good at laying out text, and can also (reluctantly) put figures into the text. Anything else is a huge hack (like TikZ), and one constantly runs into crazy limitations such as the fixed-point math and the lack of a decent visual editor. Slides should never have paragraphs of text on them, so the layout engine is not very useful there, but the other limitations are very annoying.
TikZ and Asymptote are more or less the only general-purpose modular illustration markup languages we have around. Anything better is welcome, but graphical editors are not an alternative in some cases.
I wouldn't say underrated. Literally every single research article in maths and cs, every PhD dissertation and master thesis in these fields too, are written in LaTeX.
Most students, and many researchers use Overleaf nowadays, though.
> I wouldn't say underrated. Literally every single research article in maths and cs, every PhD dissertation and master thesis in these fields too, are written in LaTeX.
Usage level is not correlated to "rate". Sometimes people use stuff because they have to, not only because they like it. See the Microsoft Word case.
I'd agree that LaTeX has fallen a bit in popularity these days against Typst - but not much in its usage. It is still the de facto standard for scientific and technical document typesetting.
One reason is that many journals supply LaTeX templates. And I find them easier to apply compared to their Word templates. I wonder how much support Typst has from these publishers, considering its relatively young age.
KeenWrite basically transforms Markdown -> X(HT)ML -> TeX -> PDF, although it uses ConTeXt instead of LaTeX for typesetting because ConTeXt makes separating content from presentation a lot easier.
For personal use, maybe Markdown + pandoc, or Typst for more complex stuff. For academic use I don't think there are any, because everything still revolves around LaTeX. But the lack of alternatives doesn't mean LaTeX is pleasant to use.
Check out Tectonic which is an all-in-one LaTeX toolchain (single executable w/ engine + build system) that lazily downloads TeX Live (no upfront multi-gig downloads). It's a breath of fresh air in the chaotic LaTeX landscape. Bit of a shame that they opted for XeTeX rather than LuaTeX though.
LuaTeX is the de facto successor of pdfTeX and is basically a more maintained pdfTeX with Unicode support and Lua scripting, whereas XeTeX has its own engine. In practice, it means that LuaTeX "just works" with most documents while with XeTeX you run into all sorts of weird incompatibilities. Fancy packages that make use of Lua scripting (e.g., graphdrawing) will only work with LuaTeX.
Thanks! I vaguely remembered getting a bit of mixed messages with regards to the two last year when I was looking into Tectonic. I just read a bit into the github issues and it seems like the Tectonic devs happened to fork Xelatex and not Lualatex. https://github.com/tectonic-typesetting/tectonic/issues/158#...
Anyway, what they made works perfectly for me; I luckily don't use any of the fancy graphics packages that use Lua. I use Latex a few times a year at most, and Tectonic just works for me. With my previous Lualatex workflow I had to deal with tlmgr and that whole ant's nest, figuring out one by one which packages I was missing on each recompile.
Seems like the main argument against Xetex in the article you linked is that it is unmaintained, so it doesn't really apply to Tectonic, but it's a bit frustrating that an opportunity for ecosystem convergence has potentially been missed.
Don't get me wrong, I love LaTeX, having written my PhD thesis in it. But with the current tools, I would use Quarto instead. It's much easier, you can still "inject" LaTeX and it's quicker for less technical collaborators to adapt.
I don't know anything about quarto, but you're missing a lot of useful software if you're limiting yourself to the distro repo - especially Debian stable.
As a matter of principle, I prefer to use really stable software that does not change wantonly, and whose authors took the care to put it into Debian.
My 20 year-old .tex documents still compile today. Will the same happen with quarto? (or typst, for that matter?) The fact that they offer no packages in the debian standard distribution signals they have likely succumbed to the awful trend of version churning, where you need to use the last version of the software or else. Thus, probably, in 20 years my documents will be un-compilable. For legacy things like typeset documents, it's reasonable to prefer legacy solutions like latex.
Once quarto and typst have stabilized enough to appear in debian stable, I'll consider them as viable alternatives.
Having digitized a university physics textbook by hand, I can say this is a very nice LaTeX guide for everyone interested. One thing worth noting from a 2025 perspective is that the "default" local setup is most likely going to be VSCode with the LaTeX Workshop[1] and LTeX+[2] extensions, and that you should use TeX Live on every platform supported by it (since MiKTeX and friends can lag). Also, use LuaTeX, as it has been the officially recommended[3] engine since November 2024.
Just a note on MyST's citations feature, as I was researching it this morning: until this ticket [1] is worked on, there's one bibliography style and that's it.
It's easier and good enough to just use LyX, a graphical document editor with a bunch of backends and templates, and if you really need to do something special you can still drop down to LaTeX and do your own templating.
It's published under GPL so relatively protected from corporate nuisances. Takes five minutes to teach someone how to mark headlines, add content listing and change document type, then a little more to teach how to add tables and images.
I remember having to learn LaTeX to write my research papers. Probably the worst time I’ve had learning something. As someone who has OCD for making everything consistent, trying to achieve the same in LaTeX made me wanna give up research itself. In fact, now that I think about it, I did give up research due to it.
Wow! Consistency is the reason I took up LaTeX, after suffering a disaster with Word. LaTeX feels dated and isn't flawless, but inconsistency is one complaint that I've never heard about it before.
I'd say it was more about how esoteric LaTeX extensions/plugins felt when I picked it up (I was just getting into the field back then). I am sure that now, after years of experience, I would be more open to giving it a shot again. Unfortunately, I am no longer in research. I did have fun converting my LaTeX resume to Typst, though.
Using extensions these days is more like searching a software registry (CTAN in this case) for packages and looking up its API documentation. But your last sentence says a lot. The reason why Typst feels so much more ergonomic is that it resembles modern programming and markup languages. They have the advantage of hindsight. LaTeX markup does indeed feel esoteric, no matter how many times you use it. Perhaps it made more sense in its days.
I used LaTeX for writing my undergraduate thesis (more than a decade ago). Nowadays, unless I'm writing something involving complicated math expressions or something fancy like a Karnaugh map, a chessboard diagram, etc., LaTeX is most likely overkill. Markdown is more than enough.
I'm looking for something that you can embed in your own application. LaTeX would be great, but it's not really nice to have WEB code in your C application. It also has a somewhat troublesome license.
It embeds almost anywhere, including via client-side WASM, and someone even made a nice TypeScript lib [0]. If you dislike `typst`, it even has a package that transpiles LaTeX strings into native typst, which somehow doesn't seem to make `typst` any less fast [1]. WASM plugin magic will do that!
The curious consequence is that the fastest and most portable way to render lightweight LaTeX code might actually be... to transpile LaTeX to embedded `typst`? Sure, sure, not all of LaTeX will map. But from an 80/20 mindset it might just be enough.
The special naming convention is mostly used as a form of elitist gatekeeping; it's a tribal shibboleth. You should be able to figure out whether someone is talking about latex (polymer) or latex (markup) from context, so having a special naming convention is rather pretentious and superfluous. If calling it lay-tek was important, they should've called it laytek. Historically, as I understand it, distinctions between the written and spoken forms of words were used as a form of gatekeeping between elites and commoners. Same goes for arXiv.
The great thing about language is that you can just change things if enough people play along. Call it gif or jif, arxiv or archive, latex or laytek.
They did call it Latek, in a way: the X in the name is the Greek letter χ (chi), and in my opinion it's not an elitist shibboleth but a way to showcase the advantages of their typesetting system over the methods existing at the time, which didn't invest a lot in rendering non-ASCII character sets.
Well it does lack that final -e of archive, and—fun fact!—like LaTeX which I know as ['la:teç], arXiv could plausibly be pronounced [ar'çi:f] in German although I've never heard anyone pronounce that name. So why is it arXiv not at least arXive with an -e?
Eh - the name is a bit of fun. It's fun to form communities with in-jokes and tribal knowledge and silly names. It doesn't hurt anyone. Let's not sacrifice everything joyous at the altar of being friendly to noobs.
Or “lah-tek”, the Wikipedia article doesn’t seem to address which, if either is preferred. And I think Leslie Lamport said that he didn’t want to impose any particular pronunciation.
From a linguist's point of view this is a perfect example of the chaos that ensues when people try to say how things are pronounced without describing the speech sounds using standard vocabulary.
Here are the inept pronunciation instructions on the LaTeX project website:
> «Lah-tech» or «Lay-tech» (to rhyme with «blech» or «Bertolt Brecht»)
The pronunciation of the 'ch' in 'blech' isn't really standardized, so that's not much help. If we go by the German pronunciation of Brecht then the sound should be [ç], i.e. a voiceless palatal fricative. But this seems to be a mistake, as Knuth intended the X in TeX to be [x], i.e. a voiceless velar fricative. In German, [ç] is an allophone of /x/ (conditioned by the preceding vowel), but they are distinct sounds, and Knuth's directions for the pronunciation of TeX unambiguously specify [x]. It seems unlikely that this difference between the X in LaTeX and the X in TeX is intentional, so maybe this was a confused attempt to identify the [x] sound.
Really then, it's anyone's guess how LaTeX is supposed to be pronounced, since no-one with authority to specify has bothered to look up the IPA symbols for the relevant speech sounds. But IMO while [leɪtɛk] is a perfectly common and acceptable pronunciation, it can only really be understood as an anglicization of [leɪtɛx] rather than the canonical pronunciation.
[0]: https://weasyprint.org/
[1]: https://print-css.rocks/
[2]: https://pagedjs.org/
Fair! I was just aspiring to a place where web pages and documents converge more.
Thanks for the recommendations!
You can change that as you go along.
That's not the point they were trying to make. You may need to change the display target for every viewer.
pros:
- one small compiler that can output: pdf, png, svg, html
- compilation is fast (see below)
- syntax is much cleaner than Latex
- few ways to do a thing
- already has all the templates most people need
- tooling is good enough with VS Code
- supports SVG images
cons:
- fewer users?
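To make "cleaner syntax" concrete, here is roughly what a small self-contained Typst document looks like (content and file name made up):

```
#set page(paper: "a4", margin: 2cm)
#set heading(numbering: "1.")

= Introduction <intro>
Inline math like $e^(i pi) + 1 = 0$ just works, and so do
cross-references to @intro or to @results.

#figure(
  image("plot.svg", width: 70%),  // hypothetical file
  caption: [Results of the experiment.],
) <results>
```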
Things like default margins, in my opinion, are a lot easier to fix than these other issues.
https://typst.app/
Typst on the other hand is inherently readable.
As an alternative, Tectonic is a bit faster than the standard LaTeX distributions:
https://github.com/tectonic-typesetting/tectonic/discussions...
https://tectonic-typesetting.github.io/en-US/
Also, typst is just really good.
I'm sure you remember that quite clearly.
In two years I hardly met anyone still doing pure LaTeX publications, unless the publishing body only accepted LaTeX as submission format.
That and Computer Modern. I bet a significant number of users use it because of that!
Personally I would just use LyX. Its equation editor is actually fantastic.
Many people say that they use LaTeX because it produces more beautiful output. Microtypography is one of the reasons for that. It's especially noticeable when microtype pushes hyphens or quotes at the end of a line slightly into the margin. (A nearby comment mentions that Typst has this feature, too.)
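For anyone who wants to see the effect for themselves, enabling it is essentially a one-liner. A minimal sketch of an ordinary document (the options shown are just the standard protrusion/expansion switches, nothing tied to any particular template):
```
\documentclass{article}
% protrusion lets hyphens and quotes hang slightly into the margin;
% expansion subtly stretches/shrinks glyphs to even out interword spacing
% (expansion needs pdfLaTeX or LuaLaTeX; XeLaTeX only does protrusion)
\usepackage[protrusion=true,expansion=true]{microtype}
\begin{document}
Enough running text to wrap across several lines, so that hyphens and
closing quotation marks at the ends of lines get protruded ``like this.''
\end{document}
```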
It has a lot of good things going for it, but it is the least attractive font that I think I have ever seen.
And there's another microtype PR open, by the reporter of the linked issue (nice!)
The microtype user manual shows how much thought has gone into it: https://mirror.foobar.to/CTAN/macros/latex/contrib/microtype...
I find some stuff like this... is it raw PDF directives? Literally an example of something Typst can't do right now. I also can't read this.
```
\def\mt@toggle@sample#1{%
  \pdfstartlink user{/Subtype/Link
    /BS << /Type/Border/W 1 /S/D /D[4 1] >>
    /H/O
    /C[0.65 0.04 0.07]
    /Contents(Click to Toggle #1!)
    %/OC << /Type/OCMD /VE[/Not \csname mt@_compatibility@\endcsname] >> % not honoured by older viewers anyway
    /A << /S/SetOCGState
          /State[/Toggle \csname mt@#1@true\endcsname \csname mt@#1@false\endcsname] >>}
  #1 \hfill\pdfendlink &
  \mt@layer{#1true}{\rlap{on}}\mt@layer{#1false}{off}}
```
A lot of things are possible in PDF.
What doesn't work?
Are people looking seriously at the shortcomings of LaTeX and moving towards modern replacements?
Major problems include:
- Tables are a huge pain.
- Customized formatting like chapter headings, footers, etc. is painful.
- LaTeX as a language somehow has issues with composability of commands. The details elude me now, but it was something like: if you have one command that makes text bold and another that makes it italic, applying one to the output of the other should give you bold italic, yet that composition doesn't happen for some commands (see the sketch after this list).
- Mixing of physical and logical formatting.
- A lot of fine-tuning is required to get passable final output.
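If I had to guess at the composability issue, it's the old two-letter font switches, which are complete font changes rather than attribute toggles. A minimal sketch of the difference (not claiming this is exactly what was hit, just the classic example):
```
\documentclass{article}
\begin{document}
% Old-style switches don't compose: \it selects the italic shape of the
% medium series, silently discarding the \bf that came before it.
{\bf\it not actually bold italic}

% The LaTeX2e commands do compose:
\textbf{\textit{bold italic}}
{\bfseries\itshape also bold italic}
\end{document}
```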
https://typst.app/docs/reference/model/table/
https://typst.app/docs/guides/table-guide/
Ah yes, this definitely is the “Modern” approach.
There does seem to be an open-source, non-SaaS part, but information about it looks pretty deliberately buried.
So... yeah.
That is an overly broad generalization.
> no hesitation to pay AWS, Netflix, Amazon, etc.
Again, an overly broad generalization.
I am unsure what kind of conclusion you can objectively make out of such generic statements.
They developed the main face of the product first - the online webapp which has live collaboration - which sounds like a sane choice for a new company.
It does, but this is actually part of the critique. Typst is developed by a company, while LaTeX is not.
I used LaTeX for 20+ years and don't know how to file a bug for LaTeX. Do I file it against XeLaTeX, LaTeX? Where? How do I update things? Download 4 gigs? Where's the documentation? Where's a book that explains how to contribute to LaTeX? These are some of the issues I've dealt with and am happy to never have to deal with again.
LaTeX and Typst are markup primarily for PDF-style content. Something like AsciiDoc or even Markdown is more appropriate for HTML-style content. You can always swap the purposes, but I never got satisfying output by doing that.
It's partly because I love the simplicity/power of Org and I do all my writing in it nowadays, the other part is to separate the content from the presentation so I can have the content in two different languages but still end up with the same formatted document for both.
Anyone have experience with this or have favorite LaTeX templates for CVs?
I'm currently experimenting with this:
https://titan-c.gitlab.io/org-cv/
My cv is an adaptation of one of the templates there: https://drive.google.com/file/d/1woxVNcJ4AmT7dD2WEnYr9BHEEY7...
EDIT: ahahahahaha I just came across this cv: https://www.overleaf.com/latex/templates/resume-slash-cv-tem...
I'm totally stealing that.
I tried to avoid custom commands and environments to keep it simple. Your content in org text should fit nicely with this.
It also has a template where the preamble is stored in a separate file, so you can try a different look by just (un)commenting a different preamble file.
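For anyone who wants to replicate that setup outside org-cv, the mechanism is just \input with the look-and-feel kept out of the main file. A minimal sketch (the file names here are made up):
```
% main.tex -- content only; the "look" lives in a swappable preamble file
\documentclass{article}
\input{preamble-classic}   % comment this out and ...
%\input{preamble-modern}   % ... uncomment this one for a different look
\begin{document}
\section{Experience}
Same content, different presentation depending on the active preamble file,
which would just hold \usepackage lines and formatting commands.
\end{document}
```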
I never got into emacs. Is Org worth it?
https://github.com/boerseth/cv
These days I usually default to pandoc's markdown, mostly because the raw text is very readable.
I don't know, if your slides are just a few keywords in a few bullet points and the occasional picture / diagram, WYSIWYM is great.
I agree that you shouldn't turn an actual article into a presentation though.
LaTeX has all the tooling to write high-quality ones.
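For the "few keywords and the occasional picture" case, the whole source can stay tiny. A minimal beamer sketch (plain defaults, no particular theme; the graphics file name is hypothetical):
```
\documentclass{beamer}
\begin{document}

\begin{frame}{Key points}
  \begin{itemize}
    \item one keyword
    \item another keyword
  \end{itemize}
\end{frame}

\begin{frame}{A diagram}
  \centering
  % \includegraphics[width=0.8\textwidth]{diagram} % assuming diagram.pdf exists
\end{frame}

\end{document}
```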
Most students, and many researchers use Overleaf nowadays, though.
Usage level is not correlated to "rate". Sometimes people use stuff because they have to, not only because they like it. See the Microsoft Word case.
I'd agree that LaTeX has fallen a bit in popularity these days against Typst - but not much in its usage. It is still the de facto standard for scientific and technical document typesetting.
Perhaps it's a programmer thing.
Don't you need to insert tons of `frame` environments to get anything worth looking at?
Have you considered writing pandoc-style Markdown that's converted to TeX for typesetting? If not, have a peek at my text editor:
* https://keenwrite.com/screenshots.html
* https://www.youtube.com/playlist?list=PLB-WIt1cZYLm1MMx2FBG9... (see tutorials 4 and 9)
KeenWrite basically transforms Markdown -> X(HT)ML -> TeX -> PDF, although it uses ConTeXt instead of LaTeX for typesetting because ConTeXt makes separating content from presentation a lot easier.
[0]: https://github.com/tectonic-typesetting/tectonic
Edit: Looks like it's not de facto anymore, and LuaTeX is now recommended for all documents and XeTeX is being recommended _against_ (https://www.texdev.net/2024/11/05/engine-news-from-the-latex...)
Anyway, what they made works perfectly for me; luckily I don't use any of the fancy graphics packages that need Lua. I use LaTeX a few times a year at most, and Tectonic just works for me. With my previous LuaLaTeX workflow I had to deal with tlmgr and that whole ants' nest, figuring out one by one which packages I was missing on each recompile.
Seems like the main argument against XeTeX in the article you linked is that it is unmaintained, so it doesn't really apply to Tectonic, but it's a bit frustrating that an opportunity for ecosystem convergence has potentially been missed.
My 20-year-old .tex documents still compile today. Will the same happen with Quarto? (Or Typst, for that matter?) The fact that they offer no packages in the standard Debian distribution signals they have likely succumbed to the awful trend of version churning, where you need to use the latest version of the software or else. Thus, in 20 years my documents will probably be uncompilable. For legacy things like typeset documents, it's reasonable to prefer legacy solutions like LaTeX.
Once Quarto and Typst have stabilized enough to appear in Debian stable, I'll consider them as viable alternatives.
https://github.com/quarto-dev/quarto-cli/releases/tag/v1.8.1
[1] https://marketplace.visualstudio.com/items?itemName=James-Yu...
[2] https://marketplace.visualstudio.com/items?itemName=ltex-plu...
[3] https://www.texdev.net/2024/11/05/engine-news-from-the-latex...
It's a document engine that ingests Markdown (particularly the MyST superset) and builds upon "structured data" for sharing.
E.g. SciPy's proceedings: https://proceedings.scipy.org/articles/XHDR4700
1. https://github.com/jupyter-book/mystmd/issues/1462
https://www.lyx.org/
It's published under the GPL, so it's relatively protected from corporate nuisances. It takes five minutes to teach someone how to mark headings, add a table of contents, and change the document type, and a little more to teach them how to add tables and images.
I've migrated all of my latex (book layout and invoicing) usage to typst and couldn't be happier.
It embeds almost anywhere, including via client-side WASM, and someone even made a nice TypeScript lib [0]. If you dislike `typst`, it even has a package that transpiles LaTeX strings into native typst, which somehow doesn't seem to make `typst` any less fast [1]. WASM plugin magic will do that!
The curious consequence is that the fastest and most portable way to render lightweight LaTeX code might actually be... to transpile LaTeX to embedded `typst`? Sure, sure, not all of LaTeX will map. But from an 80/20 mindset it might just be enough.
[0] https://github.com/Myriad-Dreamin/typst.ts
[1] https://typst.app/universe/package/mitex/
[1] https://en.wikipedia.org/wiki/XeTeX
[2] https://en.wikipedia.org/wiki/LuaTeX
> Feel free to try lualatex instead—there are a few differences between the two that we will discuss later, but either is fine for now
Knuth & friends were on a roll naming things - the 80s must've been quite a time.
As always wiki knows all: https://en.wikipedia.org/wiki/LaTeX
Knuth, on the other hand, has a whole rationale on why it’s pronounced “tech”. (“Your keyboard should become slightly moist”, iirc).
The great thing about language is that you can just change things if enough people play along. Call it gif or jif, arxiv or archive, latex or laytek.
The "La" in latex is Leslie Lamport. https://lamport.azurewebsites.net/
Or “lah-tek”; the Wikipedia article doesn't seem to address which, if either, is preferred. And I think Leslie Lamport said that he didn't want to impose any particular pronunciation.
Here are the inept pronunciation instructions on the LaTeX project website:
> «Lah-tech» or «Lay-tech» (to rhyme with «blech» or «Bertolt Brecht»)
The pronunciation of the 'ch' in 'blech' isn't really standardized, so that's not much help. If we go by the German pronunciation of Brecht then the sound should be [ç], i.e. a voiceless palatal fricative. But this seems to be a mistake, as Knuth intended the X in TeX to be [x], i.e. a voiceless velar fricative. In German, [ç] is an allophone of /x/ (conditioned by the preceding vowel), but they are distinct sounds, and Knuth's directions for the pronunciation of TeX unambiguously specify [x]. It seems unlikely that this difference between the X in LaTeX and the X in TeX is intentional, so maybe this was a confused attempt to identify the [x] sound.
Really then, it's anyone's guess how LaTeX is supposed to be pronounced, since no-one with authority to specify has bothered to look up the IPA symbols for the relevant speech sounds. But IMO while [leɪtɛk] is a perfectly common and acceptable pronunciation, it can only really be understood as an anglicization of [leɪtɛx] rather than the canonical pronunciation.