Dataset Viewer

Auto-converted to Parquet. The viewer preview shows a single "train" split with the following columns:

  • set — string (1 class: "train")
  • id — string (5–9 characters)
  • chunk_text — string (1–115k characters; raw TeX source)
  • chunk_num_tokens — int64 (1–106k)
  • document_num_tokens — int64 (58–521k)
  • document_language — string (2 classes; "en" in the preview)

MathPile ArXiv (subset)

Description

This dataset consists of a toy subset of 8,834 TeX files (5,000 training + 3,834 testing) taken from the arXiv subset of MathPile, used for testing. You should not use this dataset. The training and testing sets are already split.

Source

The data was obtained from the training and validation portions of the arXiv subset of MathPile.

Format

  • Given as JSONL files, where each line is a JSON dict containing the single key "text" (see the loading sketch below)
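
A minimal sketch of reading one shard with the Python standard library. The file name train.jsonl is an assumption about how the shards are named, not something the card confirms:

import json

# Each line of a shard is a JSON dict with a single "text" key holding
# raw TeX source. "train.jsonl" is a hypothetical file name.
with open("train.jsonl", encoding="utf-8") as f:
    documents = [json.loads(line)["text"] for line in f]

print(len(documents), "documents loaded")
print(documents[0][:200])  # first 200 characters of the first TeX file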

Usage

  • Each record's "text" value is raw TeX/LaTeX source from arXiv papers; a sketch of loading both splits follows below
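
A hedged example of loading the splits with the Hugging Face datasets library. The shard names train.jsonl and test.jsonl are assumptions; substitute the actual file names from the repository:

from datasets import load_dataset

# Build train/test splits from local JSONL shards. The file names below
# are hypothetical placeholders for the repository's actual shards.
dataset = load_dataset(
    "json",
    data_files={"train": "train.jsonl", "test": "test.jsonl"},
)

print(dataset["train"][0]["text"][:200])  # peek at one record's TeX source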

License

The original data is subject to arXiv's licensing terms. Users should refer to arXiv's terms of use for details on permissible usage.
