10 comments

  • EmilStenstrom 1 hour ago
    I somehow find the concept of a general time series model strange. How can the same model reliably predict both egg prices in Italy and global inflation?

    And how would you even use this model, given that there are no explanations that help you trust where the prediction comes from…

    • teruakohatu 1 hour ago
      What is not generally understood is that these models don’t predict egg prices or inflation in Italy.

      They decompose a time series into trends, seasonality and residuals. That’s what they are actually modelling.

      They cannot predict events like wars in the Middle East influencing inflation, unless those events happen to follow a seasonal pattern.
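      To make that concrete, here is a minimal sketch of the classical additive decomposition those traditional models are built on (illustrative only; this is not what TimesFM does internally):

```python
# Classical additive decomposition: y[t] = trend[t] + seasonal[t] + residual[t].
# A sketch of the traditional idea, assuming the seasonal period is known.

def decompose(y, period):
    n, half = len(y), period // 2
    # Trend: moving average over one full period (edges left as None).
    trend = [None] * n
    for t in range(half, n - half):
        trend[t] = sum(y[t - half : t - half + period]) / period
    # Seasonal: average detrended value at each phase of the cycle.
    means = []
    for p in range(period):
        vals = [y[t] - trend[t] for t in range(n)
                if trend[t] is not None and t % period == p]
        means.append(sum(vals) / len(vals))
    seasonal = [means[t % period] for t in range(n)]
    # Residual: whatever trend and seasonality don't explain.
    residual = [y[t] - trend[t] - seasonal[t] if trend[t] is not None else None
                for t in range(n)]
    return trend, seasonal, residual
```

      Anything like a one-off geopolitical shock ends up in the residual, which is exactly why these models can't anticipate it.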

      • perks_12 2 minutes ago
        I am not familiar with time series models, but judging from your answer, it would be necessary to feed long time series into this model for it to detect trends. What is a token here? Can it, for lack of a better example, take in all intraday movements of a stock for a day, a week, a month, etc.?
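        On the token question: as I read the TimesFM paper, a "token" is not a single observation but a patch of contiguous time points (the paper uses input patches of 32 points), normalized and linearly projected before entering the transformer. A rough sketch of just the patching step, where the padding scheme and the numbers are illustrative assumptions:

```python
# A "token" here (per the TimesFM paper, as I read it) is a patch of
# contiguous time points, not a single value. Patch length 32 matches
# the paper; the left-padding scheme below is an illustrative assumption.

def to_patches(series, patch_len=32):
    pad = (-len(series)) % patch_len            # left-pad to a multiple
    padded = [0.0] * pad + list(series)
    return [padded[i : i + patch_len] for i in range(0, len(padded), patch_len)]

# e.g. one trading day of one-minute bars -> 13 patch "tokens"
prices = [100.0 + 0.1 * t for t in range(390)]
print(len(to_patches(prices)))                  # -> 13
```

        So a day of one-minute bars becomes a handful of patch tokens, and longer histories just mean more tokens, up to the model's context limit.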
      • cybrox 1 hour ago
        Wars in the Middle East seem to have increasingly regular patterns tied to stock market opening hours, unfortunately.
        • jofzar 6 minutes ago
          I mean, it's super obvious: it's directly tied to Scrubs' popularity.

          New season of Scrubs = new war in the Middle East.

        • rubyn00bie 33 minutes ago
          I totally agree with the sentiment, but from what I can tell, I'd say they tend to happen immediately before or after markets open and close. Essentially, and to the maximum extent, screwing absolutely everyone who isn't in the clique out of participating in the trade.

          FWIW, the only surefire way to win the trade is to buy time and assume both gross incompetence and negligence when it comes to action. The only caveat is that if the markets tank enough, this administration will signal capitulation beforehand, e.g. Trump mildly capitulating on tariffs last April after the markets proceeded to relentlessly defecate themselves.

          0-DTE options are typically, and for good reason, stupid gambles. But, right now they can’t even be considered gambling, because there’s zero chance of winning. Not just bad odds, but no odds. Again just signaling how truly malicious this admin is and its disdain for anyone and everyone not close to them.

      • visarga 1 hour ago
        That's essentially what classical ARIMA and ARMA models do.
      • d--b 1 hour ago
        The main issue is that people do use them to predict bitcoin prices intraday and that sort of thing.
        • nico 57 minutes ago
          Is it an issue because it works, or because it doesn’t? Or because it’s bitcoin?

          I genuinely want to know. Thank you

    • lovelearning 1 hour ago
      My understanding is that the synthetic training data helps capture abstract time-series patterns that are common in all domains.

      As they say in appendix 8:

      > We create the synthetic data to reflect common time-series patterns using traditional statistical models. We start with four simple time-series patterns:

      > • Piece-wise linear trends (I), where the number of the piece-wise linear components is randomly chosen between 2 and 8.

      > • ARMA(p, q) (II), where 1 ≤ p, q ≤ 8 and the corresponding coefficients are generated from either a multivariate Gaussian or a uniform, then normalized.

      > • Seasonal patterns. In particular we create the sine (III) and the cosine (IV) waves of different random periods between 4 and max context length / 2 time-points and time delays.

      If there were no such underlying patterns in the class of all time-series data, then even the idea of traditional time-series models would be fundamentally misplaced.

      And since this is a transformer model, it also looks for patterns in the problem-specific input data at inference time, just like how the input context to an LLM influences its output's relevance.
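      A rough sketch of that recipe, with the ranges taken from the quoted appendix and every other detail an illustrative assumption:

```python
import math
import random

# Sketch of the synthetic-data recipe quoted above. The ranges follow
# the appendix text; everything else here is an illustrative assumption.

def piecewise_linear(n, rng):
    k = rng.randint(2, 8)                       # number of linear pieces
    knots = sorted(rng.sample(range(1, n), k - 1)) + [n]
    y, start, level = [], 0, 0.0
    for knot in knots:
        slope = rng.uniform(-1, 1)
        for _ in range(start, knot):
            level += slope
            y.append(level)
        start = knot
    return y

def arma(n, rng):
    p, q = rng.randint(1, 8), rng.randint(1, 8)
    phi = [rng.uniform(-1, 1) for _ in range(p)]
    theta = [rng.uniform(-1, 1) for _ in range(q)]
    # Normalize the coefficients, as the appendix describes.
    phi = [c / sum(abs(v) for v in phi) for c in phi]
    theta = [c / sum(abs(v) for v in theta) for c in theta]
    y, eps = [], []
    for _ in range(n):
        e = rng.gauss(0, 1)
        ar = sum(c * y[-i - 1] for i, c in enumerate(phi) if i < len(y))
        ma = sum(c * eps[-i - 1] for i, c in enumerate(theta) if i < len(eps))
        y.append(ar + ma + e)
        eps.append(e)
    return y

def seasonal_wave(n, rng, max_context=512):
    period = rng.randint(4, max_context // 2)   # random period
    delay = rng.uniform(0, period)              # random time delay
    wave = math.sin if rng.random() < 0.5 else math.cos
    return [wave(2 * math.pi * (t + delay) / period) for t in range(n)]
```

      Mix a few of these together and you get endless training series that share structure with real data without being any particular domain.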

    • eru 35 minutes ago
      > How can the same model predict egg prices in Italy, and global inflation in a reliable way?

      How can the same lossy compression algorithm (eg JPG) compress pictures of everything in a reliable way?

      • cenamus 32 minutes ago
        It can't compress pictures of everything in a reliable way.

        Text, or anything else with lots of high-frequency components, looks terrible.

    • benob 1 hour ago
      I would say:

      - decomposition: discover a more general form of Fourier transform to untangle the underlying factors

      - memorization: some patterns recur across many domains, such as power laws

      - multitask: exploit cross-domain connections such as weather vs electricity
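      The decomposition point can be illustrated with the classical special case: a brute-force discrete Fourier scan that recovers the dominant period of a series (a sketch of the idea, not of what the model does):

```python
import math

# The classical special case of "untangling periodic factors": scan
# discrete Fourier frequencies and report the dominant period. A
# brute-force sketch of the idea, not of what the model does.

def dominant_period(y):
    n = len(y)
    best_k, best_power = 1, 0.0
    for k in range(1, n // 2):                  # skip the DC component
        re = sum(v * math.cos(2 * math.pi * k * t / n) for t, v in enumerate(y))
        im = sum(v * math.sin(2 * math.pi * k * t / n) for t, v in enumerate(y))
        power = re * re + im * im
        if power > best_power:
            best_k, best_power = k, power
    return n / best_k                           # period in time steps

y = [math.sin(2 * math.pi * t / 12) for t in range(120)]
print(dominant_period(y))                       # -> 12.0
```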

  • dash2 58 minutes ago
    So the time series are provided with no context? It's just trained on lots of sets of numbers? Then you give it a new set of numbers and it guesses the rest, again with no context?

    My guess as to how this would work: the machine will first guess from the data alone whether this is one of the categories it has already seen/inferred (share prices, Google Trends cat searches, etc.). Then it'll output a plausible completion for that category.

    That doesn't seem as if it will work well for any categories outside the training data. I would rather just use either a simple model (ARIMA or whatever) or a theoretically-informed model. But what do I know.

    • Tarq0n 3 minutes ago
      If it works for predicting the next token in a very long stream of tokens, why not. The question is what architecture and training regimen it needs to generalize.
  • EmilStenstrom 1 hour ago
    Here is the link to the blog post, which actually describes what this is: https://github.com/google-research/timesfm?tab=readme-ov-fil...
  • emsign 5 minutes ago
    Can this finally break the stock markets?
  • ra 59 minutes ago
    This has been around a few months now, has anyone built anything on it?
    • magimas 2 minutes ago
      We did some internal tests. The quality isn't bad; it works quite well. But it's essentially on the same level as an ARIMA model trained on the data, just much bigger and slower.

      So in my opinion it currently falls into a kind of void: if your use case is worth predicting and you put a data scientist on it, you're better off just training cheaper ARIMA models.
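      To make "cheaper" concrete: the classical baselines are a few lines of arithmetic. A sketch of the simplest member of the ARIMA family, an AR(1) fit by ordinary least squares (illustrative, not our actual test setup):

```python
# Fit y[t] ~ c + phi * y[t-1] by ordinary least squares, then iterate
# the recurrence to forecast. AR(1) is the simplest ARIMA-family model.

def fit_ar1(y):
    x, z = y[:-1], y[1:]
    n = len(x)
    mx, mz = sum(x) / n, sum(z) / n
    phi = (sum((a - mx) * (b - mz) for a, b in zip(x, z))
           / sum((a - mx) ** 2 for a in x))
    return mz - phi * mx, phi                   # intercept c, coefficient phi

def forecast(y, steps, c, phi):
    out, last = [], y[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out
```

      Fitting this is effectively instant on any hardware, which is the bar a large pretrained model has to clearly beat.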

  • Foobar8568 1 hour ago
    Somehow I missed that one. Are there any competitors to this?

    I've always had difficulties with ML and time series, so I'll need to try this out.

  • jdthedisciple 59 minutes ago
    Let me be blunt: Shannon would tell us that time-series forecasting is bullshit:

    There is infinitely more entropy in the real world out there than any model can even remotely capture.

    The world is not minecraft.

    • mikkom 47 minutes ago
      Yeah, all weather forecasts are just magic.
      • tgv 13 minutes ago
        Weather forecasts are notoriously iffy, and accuracy drops with time, but we understand the physics behind it (to a large extent). There's also a lot of fine-grained data available. For some arbitrary time series, there's only one data sequence, and the model is unknown. Extrapolation then becomes a lot more magical.
      • kgwgk 17 minutes ago
        Whether forecasting is simple: it either rains or it doesn’t. 50/50 probability!
      • eru 34 minutes ago
        And JPG doesn't work either...