Quantiacs Community

    Machine Learning - LSTM strategy seems to be forward-looking

    General Discussion
    • black.magmar

      Hi,
      Your Machine Learning - LSTM strategy seems to be forward-looking. You train the model on feature_for_learn_df, then compute predictions for features_cur (whose timestamps lie in the past), and finally use those predictions as the weights for the same past timestamps (weights.loc[dict(asset=asset_name, time=features_cur.time.values)] = prediction). As a result, each weight comes from a model that has already seen data from the future relative to that weight's timestamp.
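
      A minimal sketch of the pattern I mean, using pandas and a generic classifier on synthetic data (feature_for_learn_df, features_cur and weights are the names from the template; everything else here is made up for illustration):

      ```python
      # Simplified illustration of the pattern described above, not the template code.
      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      prices = pd.Series(np.cumsum(rng.normal(size=300)) + 100.0,
                         index=pd.date_range("2020-01-01", periods=300))

      features = prices.pct_change().to_frame("ret").dropna()
      nxt = prices.shift(-1).reindex(features.index)
      target = (nxt > prices.reindex(features.index)).astype(float)

      # The model is fit on the whole historical slice (every row that has a label) ...
      feature_for_learn_df = features[nxt.notna()]
      model = LogisticRegression().fit(feature_for_learn_df, target[nxt.notna()])

      # ... and predictions are then produced for the SAME past timestamps.
      features_cur = features
      prediction = pd.Series(model.predict_proba(features_cur)[:, 1],
                             index=features_cur.index)

      # Used as weights for those past timestamps, every value except the last
      # comes from a model that has already seen the future relative to that date.
      weights = prediction
      ```
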
      Thank you

      • Vyacheslav_B @black.magmar

        @black-magmar Hello. This may seem strange, but it is not entirely the case.

        Notice how the target classes are derived: a shift into the future is used.
        For the last available date in the data there is no target class.
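
        For example (a sketch of the idea only; the exact label construction in the template may differ):

        ```python
        # Sketch: a label defined via a shift into the future leaves the last date unlabeled.
        import pandas as pd

        prices = pd.Series([100.0, 101.0, 99.5, 102.0, 103.0],
                           index=pd.date_range("2021-01-01", periods=5))

        nxt = prices.shift(-1)                                 # tomorrow's price
        target = (nxt > prices).astype(float).where(nxt.notna())

        print(target)
        # 2021-01-01    1.0
        # 2021-01-02    0.0
        # 2021-01-03    1.0
        # 2021-01-04    1.0
        # 2021-01-05    NaN   <- no target class for the last available date
        ```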

        In each iteration, the backtester operates on the latest forecast. Although the entire series is forecast, only the last value, the one without an available target class, is actually relevant.

        This can be verified by running the function in a single-pass mode and examining the final forecast.
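
        Conceptually (this is only my understanding, not the actual backtester internals), the multi-pass run looks like this:

        ```python
        # Conceptual sketch only: in multi-pass mode the strategy is re-run for every
        # date on data up to that date, and only the weight for the most recent
        # timestamp is carried into the result.
        import pandas as pd

        def strategy(history: pd.Series) -> pd.Series:
            """Stand-in for the ML strategy: returns a weight for every past date."""
            return history.pct_change().fillna(0.0).clip(-1.0, 1.0)

        prices = pd.Series([float(p) for p in range(100, 110)],
                           index=pd.date_range("2021-01-01", periods=10))

        applied = {}
        for t in prices.index:
            full_forecast = strategy(prices.loc[:t])    # the whole series is forecast ...
            applied[t] = full_forecast.iloc[-1]         # ... but only the last value is used

        applied_weights = pd.Series(applied)
        ```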

        I assume this is done in such a way that the strategy can be run in both single-pass and multi-pass modes.

        This made me curious, so I will additionally verify what I have written here.

          • black.magmar @Vyacheslav_B

          Thank you for your prompt reply. The model is trained correctly, by shifting the label/target column, but if the predictions for the training-set timestamps are used as the weights for those same timestamps, they rely on information that would not have been available in reality: the training set contains data up to the latest timestamp of the slice, while a weight should be derived exclusively from earlier timestamps.
          I do not know the details of how the qnbt.backtest_ml function slices the data, but if it uses the weights exactly as returned by the predict(models, data) function, they may be forward-looking. I will look at the single-pass version; it may behave differently.
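
          For comparison, a point-in-time-correct way to fill past weights would be an expanding-window refit, where the weight at each timestamp comes only from strictly earlier data (purely a sketch on synthetic data, not the template code, and slow if done naively):

          ```python
          # Hypothetical point-in-time sketch: the weight at time t comes from a model
          # fit only on data strictly before t (expanding-window refit).
          import numpy as np
          import pandas as pd
          from sklearn.linear_model import LogisticRegression

          rng = np.random.default_rng(1)
          prices = pd.Series(np.cumsum(rng.normal(size=200)) + 100.0,
                             index=pd.date_range("2020-01-01", periods=200))
          features = prices.pct_change().to_frame("ret").dropna()
          nxt = prices.shift(-1).reindex(features.index)
          target = (nxt > prices.reindex(features.index)).astype(float).where(nxt.notna())

          weights = pd.Series(index=features.index, dtype=float)
          for i in range(50, len(features)):          # 50-observation warm-up window
              X_train = features.iloc[:i]             # data strictly before timestamp i
              y_train = target.iloc[:i]
              mask = y_train.notna()                  # drop rows without a label
              model = LogisticRegression().fit(X_train[mask], y_train[mask])
              weights.iloc[i] = model.predict_proba(features.iloc[[i]])[:, 1][0]
          ```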

              • support @black.magmar

               @black-magmar You are correct, but this kind of forward-looking is always present when you have all the data at your disposal. The important point is that there is no forward-looking in the live results, and there cannot be, because the prediction is made for a day for which the data are not yet available.
