Make a robots.txt file and drop it in your site's root directory, so crawlers can fetch it at /robots.txt.
It tells search engine crawlers which parts of your site (in this case, everything under / ) they're allowed to crawl. When a crawler like Googlebot or Bingbot visits, it fetches your robots.txt first and follows those rules while discovering pages it didn't know about before. To be clear, robots.txt doesn't hand out ranking boosts and can't set a crawl schedule; what it does is make sure crawlers aren't blocked from anything you want indexed, and it can point them at your sitemap so new pages get discovered faster. I recently added this exact robots.txt to a website of mine that was getting around 4,200 uniques a day. Afterwards, traffic grew to just under 7,000 uniques per day as Google and Bing picked up more of my pages, though I can't rule out other factors contributing.
Code to put inside of your robots.txt:
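A minimal allow-all example. The Sitemap line is optional and assumes your sitemap lives at /sitemap.xml on your own domain — swap in your real URL (the example.com address here is a placeholder):

```
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

`User-agent: *` applies the rules to all crawlers, and `Allow: /` permits crawling of the entire site (an empty `Disallow:` line achieves the same thing). If there are directories you don't want crawled, such as an admin area, add a `Disallow: /admin/` line under the same `User-agent` group.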
Hope this helps!