Title: A Relational Gradient Descent Algorithm For Support Vector Machine Training
Authors: Mahmoud Abo-Khamis, Sungjin Im, Benjamin Moseley, Kirk Pruhs, and Alireza Samadian
Proceedings of the Symposium on Algorithmic Principles of Computer Systems (APOCS), pp. 100-113
Chapter DOI: https://doi.org/10.1137/1.9781611976489.8

Abstract: We consider gradient-descent-like algorithms for Support Vector Machine (SVM) training when the data is in relational form. For relational data, the gradient of the SVM objective cannot be efficiently computed by known techniques because it suffers from the "subtraction problem". We first show that the subtraction problem cannot be surmounted: computing any constant-factor approximation of the gradient of the SVM objective function is #P-hard, even for acyclic joins. However, we circumvent the subtraction problem by restricting our attention to stable instances, which, intuitively, are instances where a nearly optimal solution remains nearly optimal if the points are perturbed slightly. We give an efficient algorithm that computes a "pseudo-gradient" and guarantees convergence on stable instances at a rate comparable to that achieved by using the actual gradient. We believe our results suggest that this sort of stability analysis is likely to yield useful insight in the design of relational algorithms for other learning problems in which the subtraction problem arises.

Published: 2021
eISBN: 978-1-61197-648-9
Book DOI: https://doi.org/10.1137/1.9781611976489
Book Series Name: Proceedings
Book Code: APOCS21
Book Pages: ii + 158
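For context, the sketch below shows subgradient descent on the standard soft-margin SVM objective in the non-relational setting, where the design matrix X is fully materialized; avoiding that materialization is exactly the point of the relational setting studied in the paper. The hyperparameters lam, lr, and iters are illustrative choices, not values from the paper, and the paper's pseudo-gradient construction for stable instances is not reproduced here. Note that the data term of the subgradient is a sum over only the margin-violating points; it is this conditional aggregate that the paper shows is #P-hard to approximate over an unmaterialized join.

import numpy as np

# Soft-margin SVM objective:
#   F(w) = (lam/2) * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * <w, x_i>)
def svm_objective(w, X, y, lam):
    margins = 1.0 - y * (X @ w)
    return 0.5 * lam * np.dot(w, w) + np.maximum(0.0, margins).mean()

def svm_subgradient(w, X, y, lam):
    # Each margin-violating point contributes -y_i * x_i / n to the subgradient.
    # On relational data, this sum over violators is the hard-to-compute quantity.
    active = (y * (X @ w)) < 1.0
    data_term = (y[active, None] * X[active]).sum(axis=0) / len(y)
    return lam * w - data_term

def train_svm(X, y, lam=0.1, lr=0.05, iters=500):
    # Plain (sub)gradient descent from the zero vector.
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        w -= lr * svm_subgradient(w, X, y, lam)
    return w

# Toy usage: two Gaussian blobs with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(+1.0, 1.0, (50, 2)), rng.normal(-1.0, 1.0, (50, 2))])
y = np.concatenate([np.ones(50), -np.ones(50)])
w = train_svm(X, y)
print("objective:", svm_objective(w, X, y, lam=0.1))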