MITS: Enhanced Tree Search Reasoning for LLMs via Pointwise Mutual Information Episode 1238


Duration: 25:31 | 🤗 Upvotes: 35 | cs.AI

Authors:
Jiaxi Li, Yucheng Shi, Jin Lu, Ninghao Liu

Title:
MITS: Enhanced Tree Search Reasoning for LLMs via Pointwise Mutual Information

Arxiv:
http://arxiv.org/abs/2510.03632v1

Abstract:
Tree search has become a representative framework for test-time reasoning with large language models (LLMs), exemplified by methods such as Tree-of-Thought and Monte Carlo Tree Search that explore multiple reasoning paths. However, it remains difficult to provide instant and reliable quantitative assessments of intermediate reasoning step quality, and extensive path exploration is computationally costly. To address this, we propose Mutual Information Tree Search (MITS), a novel framework that guides reasoning with information-theoretic principles. MITS introduces an effective scoring function based on pointwise mutual information (PMI), which enables step-wise evaluation of reasoning paths and search tree expansion via beam search without expensive look-ahead simulations, achieving superior reasoning performance while maintaining computational efficiency. The framework is complemented by an entropy-based dynamic sampling strategy that adaptively allocates computational resources to uncertain reasoning steps where exploration is most beneficial. For final prediction, MITS employs a weighted voting scheme that combines PMI scores with prediction consensus. Through comprehensive experiments on diverse reasoning benchmarks, MITS consistently surpasses baseline methods, establishing a principled and efficient framework for LLM reasoning.
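The abstract names three ingredients: PMI-based step scoring, entropy-based sample allocation, and a PMI-weighted vote over candidate answers. The sketch below is an illustration of those generic ideas, not the paper's implementation; the function names, the softmax-style vote weighting, and the per-step sampling budget formula are all assumptions for exposition, and the log-probabilities are assumed to come from some external LLM.

```python
import math
from collections import defaultdict


def pmi_score(logp_step_given_context: float, logp_step_marginal: float) -> float:
    """Pointwise mutual information between a reasoning step and its context:
    PMI(step; context) = log p(step | context) - log p(step)."""
    return logp_step_given_context - logp_step_marginal


def entropy(probs: list[float]) -> float:
    """Shannon entropy (nats) of a candidate-step distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)


def allocate_samples(step_distributions: list[list[float]],
                     base: int = 2, budget: int = 16) -> list[int]:
    """Entropy-based dynamic sampling: every step gets `base` samples,
    and the remaining budget is split in proportion to step uncertainty."""
    ents = [entropy(d) for d in step_distributions]
    total = sum(ents) or 1.0
    extra = budget - base * len(ents)
    return [base + round(extra * e / total) for e in ents]


def weighted_vote(candidates: list[tuple[str, float]]) -> str:
    """Final prediction: combine prediction consensus with PMI scores by
    weighting each candidate answer with exp(PMI) before tallying."""
    tally: dict[str, float] = defaultdict(float)
    for answer, score in candidates:
        tally[answer] += math.exp(score)
    return max(tally, key=lambda a: tally[a])
```

For example, `allocate_samples([[0.5, 0.5], [1.0]])` sends nearly the whole extra budget to the first (uncertain) step and only the base allocation to the second (deterministic) one, which matches the abstract's claim that computation is concentrated where exploration is most beneficial.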

