Convergence of the mini-batch SIHT algorithm

dc.contributor.author: Damadi, Saeed
dc.contributor.author: Shen, Jinglai
dc.date.accessioned: 2024-01-23T19:01:38Z
dc.date.available: 2024-01-23T19:01:38Z
dc.date.issued: 2024-01-10
dc.description: Intelligent Systems Conference - IntelliSys 2023
dc.description.abstract: The Iterative Hard Thresholding (IHT) algorithm has been studied extensively as an effective deterministic algorithm for solving sparse optimization problems. The IHT algorithm benefits from the information of the batch (full) gradient at each point, and this information is crucial for the convergence analysis of the generated sequence. However, this strength becomes a weakness in machine learning and high-dimensional statistical applications, because calculating the batch gradient at each iteration is computationally expensive or impractical. Fortunately, in these applications the objective function has a summation structure that can be exploited to approximate the batch gradient by a stochastic mini-batch gradient. In this paper, we study the mini-batch Stochastic IHT (SIHT) algorithm for solving sparse optimization problems. As opposed to previous works, where an increasing, variable mini-batch size is necessary for the derivation, we fix the mini-batch size for all steps according to a lower bound that we derive. To prove stochastic convergence of the objective value function, we first establish a critical sparse stochastic gradient descent property. Using this property, we show that the sequence generated by the mini-batch SIHT algorithm is a supermartingale and converges with probability one. Unlike previous work, we do not assume the function to be restricted strongly convex. To the best of our knowledge, in the regime of sparse optimization, this is the first time in the literature that the sequence of stochastic function values has been shown to converge with probability one while the mini-batch size is fixed for all steps.
dc.description.uri: https://link.springer.com/chapter/10.1007/978-3-031-47721-8_15
dc.format.extent: 8 pages
dc.genre: book chapters
dc.genre: conference papers and proceedings
dc.genre: preprints
dc.identifier.citation: Damadi, Saeed, and Jinglai Shen. “Convergence of the Mini-Batch SIHT Algorithm.” In Intelligent Systems and Applications, edited by Kohei Arai, 223–33. Lecture Notes in Networks and Systems. Cham: Springer Nature Switzerland, 2024. https://doi.org/10.1007/978-3-031-47721-8_15.
dc.identifier.uri: https://doi.org/10.1007/978-3-031-47721-8_15
dc.identifier.uri: http://hdl.handle.net/11603/31445
dc.language.iso: en_US
dc.publisher: Springer
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Mathematics Department Collection
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Student Collection
dc.rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.title: Convergence of the mini-batch SIHT algorithm
dc.type: Text
dcterms.creator: https://orcid.org/0000-0003-2172-4182
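
For context, the mini-batch SIHT iteration described in the abstract can be sketched in a few lines of Python: a stochastic gradient step computed on a fixed-size mini-batch, followed by hard thresholding onto the set of s-sparse vectors. The least-squares objective, the helper names, and the step size below are illustrative assumptions for this sketch only; they are not the paper's analysis or experimental setup.

import numpy as np

def hard_threshold(x, s):
    # Keep the s largest-magnitude entries of x and zero out the rest.
    z = np.zeros_like(x)
    keep = np.argpartition(np.abs(x), -s)[-s:]
    z[keep] = x[keep]
    return z

def minibatch_siht(A, b, s, batch_size, step, iters, seed=0):
    # Illustrative objective: f(x) = (1/2n) ||Ax - b||^2, a sum over rows,
    # so a mini-batch of rows yields an unbiased stochastic gradient.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        # The mini-batch size stays fixed for all iterations, as in the paper.
        batch = rng.choice(n, size=batch_size, replace=False)
        g = A[batch].T @ (A[batch] @ x - b[batch]) / batch_size
        # Gradient step, then projection onto the s-sparse set.
        x = hard_threshold(x - step * g, s)
    return x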

Files

Original bundle

Name: 2209.14536.pdf
Size: 192.94 KB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 2.56 KB
Format: Item-specific license agreed upon at submission