Question
Extracting table nested within div of HTML
I am trying to extract some machine data from one of our servers. The server gives me the data as an HTML file; the end goal is to extract the table data and export it to CSV to work with further.
I am having a difficult time actually extracting the table data. I am an ME and by no means a programmer, but I can write some basic Python scripts.
I have spent two days now researching HTML tags, how to parse with BeautifulSoup, etc., but it's time to raise my hand and ask for some help.
The table data seems to be nested within a div, which is either a sibling or a child of style. If I open the raw HTML with Notepad, the div appears to be a sibling of style; if I open it with Notepad++, it looks like a child of style.
However, if I pass the HTML through BeautifulSoup and call soup.prettify(), the div seems to be at a parent-level indentation within style. (Sorry, I may not be describing this correctly.)
There are no classes to gather, so I have been using soup.find_all('tag', class_=None) to aid with finding the tags
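For example, on a toy snippet (made up just to illustrate, not my real data), this is how I have been using the class_=None filter:

```python
from bs4 import BeautifulSoup

# Toy snippet (not my real data) to show the class_=None filter
snippet = '<div class="menu"><p class="note">styled</p><p>plain</p></div>'
soup = BeautifulSoup(snippet, 'html.parser')

# class_=None matches only tags that carry no class attribute at all
plain = soup.find_all('p', class_=None)
print([p.get_text() for p in plain])  # ['plain']
```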
from bs4 import BeautifulSoup
from bs4 import SoupStrainer
If I pass the HTML through BeautifulSoup and prettify it, I can see the div shown at the highest indentation level.
html = r"""<style> .text { mso-number-format:\@; } </script> <table><tr><td colspan=10 style='font-weight:800;font-size:x-large;text-align:center;'>HISTORICAL SEARCH</td></tr><table><br/>
<div>
<table cellspacing="0" rules="all" border="1" style="border-collapse:collapse;">
<tr>
<th scope="col">PLANT</th><th scope="col">SHIFT_DATE</th><th scope="col">MACHINE</th><th scope="col">PN</th><th scope="col">TOOL</th><th scope="col">PR ID</th><th scope="col">OPERATOR</th><th scope="col">SHIFT</th><th scope="col">START_TIME</th><th scope="col">END_TIME</th><th scope="col">CAVITIES CURR</th><th scope="col">CAVITIES IDEAL</th><th scope="col">GOOD PARTS</th><th scope="col">SCRAP PARTS</th><th scope="col">TOTAL</th><th scope="col">GOOD_CONTNRS</th><th scope="col">SCRAP_CONTNRS</th><th scope="col">SCRAP_COST</th><th scope="col">FTQ</th><th scope="col">MACH_CYCLES</th><th scope="col">RUN TIME</th><th scope="col">DOWN TIME</th><th scope="col">CO TIME</th><th scope="col">IDLE TIME</th><th scope="col">CYCLE IDEAL</th><th scope="col">CYCLE_AVG</th><th scope="col">CALC TEEP</th><th scope="col">CALC OEE</th><th scope="col">MSPEC</th><th scope="col">LAST MODIFIED</th>
</tr><tr>
<td>0000</td><td>07/21/2024</td><td>0007</td><td>PN123456</td><td>DIE NUMBER 1</td><td>307892</td><td>0</td><td>PM</td><td>7/21/2024 3:00:00 PM</td><td>7/21/2024 11:00:00 PM</td><td>1</td><td>1</td><td>0</td><td>0</td><td>0</td><td>0</td><td>0</td><td>0</td><td>0</td><td>0</td><td>0</td><td>479</td><td>0</td><td>0</td><td>1200</td><td>0</td><td>0</td><td>0</td><td>M7602L48</td><td>7/21/2024 11:02:47 PM</td>
</tr><tr>
<td>0000</td><td>07/21/2024</td><td>0007</td><td>PN123456</td><td>DIE NUMBER 1</td><td>307892</td><td>0</td><td>AM</td><td>7/21/2024 7:00:00 AM</td><td>7/21/2024 3:00:00 PM</td><td>1</td><td>1</td><td>0</td><td>0</td><td>0</td><td>0</td><td>0</td><td>0</td><td>0</td><td>0</td><td>0</td><td>479</td><td>0</td><td>0</td><td>1200</td><td>0</td><td>0</td><td>0</td><td>M7602L48</td><td>7/21/2024 3:02:46 PM</td>
</tr>
</table>
</div><table><tr><td colspan=10></td></tr><table>"""
soup = BeautifulSoup(html, 'lxml')
print(soup.prettify())
If I try and search for the table directly, I am left with an empty list.
soup.find_all('table')
#[]
If I try and search for the div I also get an empty list.
soup.find_all('div')
#[]
If I try and search for style with soup.find_all('style'), I can see the div with the table inside it.
I also attempted to use SoupStrainer to parse only the div:
only_div = SoupStrainer('div', {'class': None})
soup = BeautifulSoup(html, 'lxml', parse_only=only_div)
print(soup)
#empty
With my limited knowledge, I am unsure how to proceed from here and am looking for help/advice.
My best guess is that, assuming the div is a child of style, I need to search within style, then within the div, for the table. But I am not exactly sure whether the div is even a child or not.
Apologies in advance for any formatting issues, this is my first post on SO.