Abstract: This study examines logit models applied to the truck route choice problem using GPS trucking data from the Dallas metropolitan area. Instead of assuming a constant coefficient for each ...
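For context on the baseline the truncated abstract contrasts against, the sketch below shows a plain multinomial logit route-choice model with constant coefficients. The route attributes (travel time, distance) and the coefficient values are illustrative assumptions, not the study's estimated specification, and the study's relaxation of the constant-coefficient assumption is not shown here.

```python
import numpy as np

# Hypothetical attributes (travel time in minutes, distance in miles)
# for three candidate truck routes; values are illustrative only.
routes = np.array([
    [35.0, 22.0],   # route A
    [42.0, 18.5],   # route B
    [50.0, 25.0],   # route C
])

# Constant coefficients, as in a basic multinomial logit model.
# The study apparently relaxes this assumption, but the truncated
# abstract does not show how, so this only illustrates the MNL baseline.
beta = np.array([-0.08, -0.05])   # assumed disutility per minute / per mile

utilities = routes @ beta                    # systematic utility V_i = beta' x_i
exp_u = np.exp(utilities - utilities.max())  # subtract max for numerical stability
probs = exp_u / exp_u.sum()                  # P(i) = exp(V_i) / sum_j exp(V_j)

for name, p in zip(["A", "B", "C"], probs):
    print(f"Route {name}: choice probability {p:.3f}")
```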
Abstract: Knowledge Distillation (KD) is an effective model compression approach for transferring knowledge from a larger network to a smaller one. Existing state-of-the-art methods mostly focus on feature ...
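The cited paper's own method is cut off in the snippet; as background, the sketch below implements the standard logit-based KD loss (temperature-softened teacher targets plus hard-label cross-entropy). The temperature and weighting values are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    """Standard logit-based KD loss: weighted sum of hard-label
    cross-entropy and KL divergence between temperature-softened
    teacher and student distributions. T and alpha are illustrative."""
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)
    # T**2 rescales gradients of the soft term back to the hard-label scale.
    distill = F.kl_div(log_soft_student, soft_teacher,
                       reduction="batchmean") * (T ** 2)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * distill + (1.0 - alpha) * hard

# Toy usage with random logits for a batch of 8 examples and 10 classes.
student = torch.randn(8, 10)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(kd_loss(student, teacher, labels))
```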