Systematic Long Short

How To Run An Amazing Forward Feature Selection Technique

Systematic Long Short
Jan 12, 2026

Introduction

If you’ve used machine learning to solve problems, one question you will inevitably face is how to do feature selection. There are a few families of methods, and today we are covering a near-optimal variant of sequential feature selection.

Sequential Forward Selection (SFS) builds a feature subset one feature at a time. It is a powerful technique, but it is not feasible when working with an enormous family of features, and its greedy, path-dependent choices introduce inefficiencies: once a feature is added, it can never be removed.
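To make the greedy mechanics concrete, here is a minimal sketch of plain SFS. This is illustrative only, not the author's code: the `score` callable is an assumed placeholder for whatever criterion you maximize (e.g. cross-validated model performance).

```python
def sfs(candidates, score, k):
    """Greedily grow a feature subset until it holds k features.

    candidates: iterable of feature names
    score: callable taking a list of features, returning a value to maximize
    k: target subset size
    """
    selected = []
    remaining = set(candidates)
    while len(selected) < k and remaining:
        # Forward step: try each remaining feature and keep the best addition.
        best = max(remaining, key=lambda f: score(selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected
```

Note that nothing in the loop ever removes a feature once added; that is exactly the path dependency described above.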

The algorithm we are discussing today is sequential feature selection with a floating backtrack step. It achieves near-optimal results while evaluating only about 1% of the subsets a naive exhaustive search would, and it can correct early greedy mistakes, a capability that basic forward selection lacks. This results in a 20% improvement in criterion values compared to the base sequential search.
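A simplified sketch of the floating idea is below. This is a hedged illustration of the general Sequential Floating Forward Selection (SFFS) pattern, not the author's paid implementation; the `score` callable is again an assumed placeholder, and real SFFS variants add further bookkeeping (e.g. comparing against the best subset previously seen at each size).

```python
def sffs(candidates, score, k):
    """Forward selection with a conditional backward (floating) step.

    After each feature is added, features are removed as long as doing so
    improves the criterion, allowing early greedy mistakes to be undone.
    """
    selected = []
    remaining = set(candidates)
    while len(selected) < k and remaining:
        # Forward step: add the single best feature, as in plain SFS.
        best = max(remaining, key=lambda f: score(selected + [f]))
        selected.append(best)
        remaining.remove(best)
        # Floating backward step: drop features while removal helps.
        improved = True
        while improved and len(selected) > 2:
            improved = False
            # Find the feature whose removal gives the highest score.
            worst = max(selected,
                        key=lambda f: score([g for g in selected if g != f]))
            without = [g for g in selected if g != worst]
            if score(without) > score(selected):
                selected.remove(worst)
                remaining.add(worst)
                improved = True
    return selected
```

The backward loop is what distinguishes this from plain SFS: the subset can shrink again whenever a previously added feature turns out to be redundant in the presence of later additions.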

This post is for paid subscribers
