TITLE: Revenue Management Under the Markov Chain Choice Model

ABSTRACT:

A recently proposed choice model describes the customer choice process through a Markov chain. In this choice model, a customer arrives in the system with the intention of purchasing a particular product. If this product is available for purchase, then the customer purchases it and leaves the system. Otherwise, the customer transitions to another product according to a Markov chain and considers purchasing that product instead. In this way, the customer transitions between the products until she reaches a product that is available for purchase or she decides to leave the system without purchasing anything. In this talk, we consider revenue management problems in which customers choose according to the Markov chain choice model. For single-leg revenue management, we study the dynamic programming formulation of the problem. We show that the efficient offer sets are nested and that the optimal policy can be characterized by nested protection levels. For network revenue management, we study a deterministic linear program in which each subset of products is offered with a certain probability. This linear program has one decision variable for each subset of products, so the number of decision variables grows exponentially with the number of products, and the formulation is commonly solved through column generation. We show that if customers choose according to the Markov chain choice model, then the deterministic linear program can be reduced to an equivalent one whose numbers of decision variables and constraints grow only linearly with the number of products. (This is joint work with Jacob Feldman.)
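
As a rough illustration of the choice process described above, the following Python sketch simulates one customer under the Markov chain choice model. The notation is ours, not from the talk: lam[i] is the probability that the customer arrives intending to purchase product i, rho[i, j] is the probability of transitioning from product i to product j when product i is unavailable, and offered marks which products are open for purchase.

    import numpy as np

    def simulate_choice(lam, rho, offered, rng):
        """Simulate one customer under the Markov chain choice model (illustrative sketch)."""
        n = len(lam)
        # Draw whether the customer considers any product at all; the mass 1 - sum(lam)
        # is treated here as an arrival that purchases nothing.
        if rng.random() >= lam.sum():
            return None
        # Product the customer arrives with the intention of purchasing.
        i = rng.choice(n, p=lam / lam.sum())
        while True:
            if offered[i]:
                return i  # product available: purchase it and leave the system
            # Product unavailable: transition to another product or leave the system.
            stay = rho[i].sum()
            if rng.random() >= stay:
                return None  # leave without purchasing anything
            i = rng.choice(n, p=rho[i] / stay)

    # Example with three products of which only the last two are offered; averaging
    # many simulated customers estimates the induced choice probabilities P_j(S).
    rng = np.random.default_rng(0)
    lam = np.array([0.5, 0.3, 0.2])
    rho = np.array([[0.0, 0.6, 0.2],
                    [0.5, 0.0, 0.3],
                    [0.4, 0.4, 0.0]])
    offered = np.array([False, True, True])
    samples = [simulate_choice(lam, rho, offered, rng) for _ in range(10000)]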
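
For the single-leg problem, the dynamic program mentioned in the talk can be sketched as follows, in our own notation and under the common assumption that at most one customer arrives in each period t with probability \lambda; here x is the remaining capacity, r_j the revenue of product j, N the set of products, and P_j(S) the probability, induced by the Markov chain, that an arriving customer purchases product j when the set S is offered:

    V_t(x) = \max_{S \subseteq N} \Big\{ \lambda \sum_{j \in S} P_j(S) \big( r_j + V_{t+1}(x-1) \big)
             + \Big( 1 - \lambda \sum_{j \in S} P_j(S) \Big) V_{t+1}(x) \Big\},

with boundary conditions V_{T+1}(x) = 0 and V_t(0) = 0. In each state (t, x), the optimal policy offers a set attaining this maximum.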
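
For the network problem, the deterministic linear program with one decision variable per subset of products is typically written as below; this is a sketch in our own notation, with resources i of capacity c_i, consumption a_{ij} of resource i by product j, revenues r_j, horizon length T, arrival rate \lambda, choice probabilities P_j(S) induced by the Markov chain, and t(S) the amount of time during which the offer set S is made available:

    \max_{t(\cdot) \ge 0} \;\; \lambda \sum_{S \subseteq N} t(S) \sum_{j \in S} r_j \, P_j(S)
    \text{subject to} \;\; \lambda \sum_{S \subseteq N} t(S) \sum_{j \in S} a_{ij} \, P_j(S) \le c_i
        \text{ for each resource } i,
    \qquad \sum_{S \subseteq N} t(S) \le T.

Since S ranges over all subsets of the n products, this formulation has on the order of 2^n variables, which is why column generation is commonly used; the reduction described in the talk replaces it with an equivalent linear program whose size grows only linearly in n.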