Deep Web crawling refers to the problem of traversing the collection of pages in a deep Web site, which are dynamically generated in response to a particular query that is submitted …
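The query-submission step described above can be illustrated with a minimal sketch: submit one query through a site's GET-based search form and harvest the links of the dynamically generated result page. The endpoint, field name, and page structure here are illustrative assumptions, not any particular system's interface.

```python
# Minimal sketch of one deep-Web crawling step: submit a query to a
# (hypothetical) search form and collect the links from the generated page.
from html.parser import HTMLParser
from urllib.parse import urlencode
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collects href attributes from anchor tags in a result page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl_form(endpoint, field, query):
    """Submit one query through the form's GET interface and return
    the links of the dynamically generated result page."""
    url = endpoint + "?" + urlencode({field: query})  # e.g. ?q=protein
    with urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    parser = LinkCollector()
    parser.feed(html)
    return parser.links
```

A full crawler would iterate this over a query pool and deduplicate the discovered pages; the sketch covers only a single submission.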
Metadata—the machine-readable descriptions of the data—are increasingly seen as crucial for describing the vast array of biomedical datasets that are currently being deposited in …
Extracting structured information from templatic documents is an important problem with the potential to automate many real-world business workflows such as payment, procurement …
Automated test generation for web forms has been a longstanding challenge, exacerbated by the intrinsic human-centric design of forms and their complex, device-agnostic structures …
Users frequently interact with software systems through data entry forms. However, form filling is time-consuming and error-prone. Although several techniques have been proposed …
HM Jamil, K Naha - Proceedings of the 38th ACM/SIGAPP Symposium …, 2023 - dl.acm.org
The emergence of Alexa and Siri, and more recently, OpenAI's ChatGPT, raises the question of whether ad hoc biological queries can also be computed without end-users' active …
In biomedicine, high-quality metadata are crucial for finding experimental datasets, for understanding how experiments were performed, and for reproducing those experiments …
G Fuchs, H Roitman, M Mandelbrod - Proceedings of the 44th …, 2021 - dl.acm.org
Digital forms are commonly used for collecting structured information from users. However, filling digital forms that include a large number of fields is tedious and error-prone. Auto …
Web forms are widely used as an effective way for users to interact with information systems. Nevertheless, filling in forms with personal data can be tedious and repetitive …
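The auto-fill idea running through these snippets can be sketched as matching form field names to a stored user profile. This is a minimal illustration under assumed data: the profile contents and field names are hypothetical, and matching is done by a simple normalized-name lookup rather than any published technique.

```python
# Minimal sketch of auto-filling a form from a stored user profile:
# each form field is matched to a profile entry by a normalized name,
# so "e-mail" and "E-Mail" both resolve to the same stored value.
def normalize(name):
    """Lowercase a field name and drop non-alphanumeric characters."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def autofill(form_fields, profile):
    """Return a {field: value} mapping for fields with a profile match;
    unmatched fields are left for the user to complete."""
    lookup = {normalize(k): v for k, v in profile.items()}
    return {f: lookup[normalize(f)] for f in form_fields
            if normalize(f) in lookup}

# Hypothetical profile and form; "phone" has no match and stays empty.
profile = {"Full Name": "Ada Lovelace", "e-mail": "ada@example.org"}
filled = autofill(["full_name", "E-Mail", "phone"], profile)
```

Real systems add disambiguation (e.g. two "name" fields on one form) and learn matches from past submissions; the name-normalization lookup is only the simplest baseline.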