Your best bet would be to manually test your logic first on a controlled set of keywords. For example, build a campaign with a manageable number of keywords and, over a period of X impressions or clicks and Y time intervals (say every hour or day), manually optimize your bids (lower, raise, or leave as is) and record the results. Record your profit at each iteration and see whether your logic produces an ideal campaign. A higher position may not equate to higher profit, so your system should use profit as the deciding factor for its adjustments. If the adjustments you make work and can be programmed, get API access (or cURL out the process) and test it on the same campaign to see whether your code does as well as you do manually; a rough sketch of that loop is below.
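Here's one naive way to encode "profit decides the adjustment" as code. Everything in it is hypothetical (the step size, the bounds, the shape of the report data) and would need to be wired up to whatever reporting your network's API or cURL'd exports actually return:

```python
def next_bid(bid, profit_now, profit_prev, step=0.05, floor=0.10, ceiling=5.00):
    """Profit, not position, decides the adjustment: raise when the last
    interval improved profit, lower when it fell, hold when it's flat."""
    if profit_now > profit_prev:
        bid += step
    elif profit_now < profit_prev:
        bid -= step
    return min(ceiling, max(floor, bid))   # clamp so bad data can't run away

history = {}   # keyword -> profit recorded in the previous interval

def run_interval(report):
    """One iteration of the manual process: record results per keyword,
    then decide each keyword's next bid.
    report: iterable of (keyword, current_bid, profit_this_interval)."""
    adjustments = {}
    for keyword, bid, profit in report:
        prev = history.get(keyword, profit)   # first interval: no change
        adjustments[keyword] = next_bid(bid, profit, prev)
        history[keyword] = profit
    return adjustments
```

Run it by hand against your recorded spreadsheet first; if its decisions match the ones you made manually, then it's worth automating.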
The only thing to keep in mind is that there are a lot of outside factors that can skew your results and that are hard to control for automatically. For example, a handful of new bidders could come in on a single day and cause you to make adjustments that fry your campaign. Certain days (holidays, weekends, etc.) may yield different conversion rates that, if not accounted for, could again cause your logic to nuke your campaign (i.e. lower bids too far to maintain a decent CTR and burn your history). You'll also need to set limits/thresholds on your system: there are plenty of occasions where you may have to start bidding in a position where you're running at a loss while the network's algorithm optimizes you, so programming around this can be both tricky and dangerous. The sketch after this paragraph shows the kind of guard rails I mean.
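A minimal sketch of those guard rails, assuming the same kind of loop as above. The numbers and the skip-date list are placeholders, not recommendations:

```python
from datetime import date

MIN_CLICKS_BEFORE_ACTING = 100      # don't adjust on thin data
MAX_DAILY_CHANGE = 0.20             # cap how far a bid can move per interval
ABSOLUTE_FLOOR = 0.10               # never drop below this, even when running at a loss early on
SKIP_DATES = {date(2024, 12, 25)}   # holidays you want handled separately

def safe_adjust(old_bid, proposed_bid, clicks, today):
    """Apply the proposed bid only when the data and the calendar allow it."""
    if today in SKIP_DATES or today.weekday() >= 5:
        return old_bid                 # weekend/holiday conversions skew the signal
    if clicks < MIN_CLICKS_BEFORE_ACTING:
        return old_bid                 # not enough data to act on yet
    change = max(-MAX_DAILY_CHANGE, min(MAX_DAILY_CHANGE, proposed_bid - old_bid))
    return max(ABSOLUTE_FLOOR, old_bid + change)
```

The point of the wrapper is that the optimizer can propose whatever it likes, but a single bad day of data can only move a bid by a bounded amount.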
Lots of factors to consider outside of 'submit keyword, gradually lower bid.'