iic_hyd

ONE MILLISECOND TIMER

Hi, can anyone help me in making a one millisecond software timer in an S7-300 or SLC?

I don't know the answer, but I suggest you'll probably get a quicker response in the Siemens forum.

Wait a minute, you said SLC. Well, maybe I'm not fully understanding your question, but 1 ms is one of the available time bases for the TON instruction, so you would just trigger a TON with .01 s as the time base and 1 as the preset, and then monitor the status of the .DN bit.

The SLC platform does not support 1 millisecond timers, but the MicroLogix and CompactLogix/ControlLogix platforms do. The .01 time base in the SLC platform is 1/100th of a second, which is 10 milliseconds.

I thought the MicroLogix only supported the .01 time base as well? Or at least the 1100 and 1200 did.

The bottom-dwelling ML1000 supports a 1 second and a .01 second time base. The ML1100, 1200, and 1500 all have an additional millisecond time base, and their free-running clock in S:4 is in 100 microsecond slices, so with a little creative programming you could come up with a .1 millisecond timing resolution (however, to expect that a program will react that fast is unrealistic).
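
Here's a rough Python sketch of that delta-timing idea against a free-running clock word like S:4. The 100 microsecond tick is from the post above; the 16-bit word width and all names are assumptions for illustration only:

```python
# Illustrative sketch only: delta timing against a free-running clock
# word like S:4.  Assumed (not from the post): the word is 16 bits wide
# and rolls over.  Each tick is 100 microseconds, as stated above.

WORD_MASK = 0xFFFF      # 16-bit rollover (assumed width)
TICKS_PER_MS = 10       # 10 x 100 us = 1 ms

def elapsed_ticks(start, now):
    """Ticks elapsed between two reads of the clock word.  The mask
    handles rollover, provided fewer than 65536 ticks (~6.5 s) pass
    between the two reads."""
    return (now - start) & WORD_MASK

# Usage: latch the clock word when the event of interest starts, then
# compare on every scan until the target interval has elapsed.
start = 65530                       # latched value (near rollover)
now = 14                            # value read on a later scan
if elapsed_ticks(start, now) >= TICKS_PER_MS:
    print("at least 1 ms elapsed")  # resolution is 0.1 ms per tick
```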

I almost never use the ML1000s, so I incorrectly assumed that they supported the same time bases as the 1200 (which I use a lot).

And I incorrectly assumed that none of the MicroLogix processors could do it, though about all I use is the 1100. But I usually set the time base at .01 s. It seems like we have already been through this in another thread, but what would be the use of such a timer? At best, a typical program scan is going to be 3 to 5 ms.

Same here. The ML1200 is hardly any more money but provides immensely more functionality.

And are all of you guys telling me that the scan rate of the PLC will not affect the timer? In my experience it does, especially when you are talking of 1 ms. Although I am aware of special interrupt timers that can act independently of the scan. (Edit: Sorry TW, missed you pointing that out.)

Not quite. The ML 1 ms timer will remain accurate independent of the scan time as long as the timer instruction is scanned at least once every .256 seconds. The ability of the program to react with millisecond precision, however, is dependent on the scan time. The ML processors are a bit faster than the SLC processors, but not fast enough that one can expect millisecond response; you can, though, expect better performance than with the .01 second time base.
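
One way to picture that .256 second limit (this is an assumed model for illustration, not anything from AB documentation): if the firmware tracked the time between executions of the timer instruction as an 8-bit count of milliseconds, any gap longer than 255 ms would wrap around and the excess would be silently lost.

```python
# Toy model (assumed, not vendor documentation) of why a 1 ms time base
# timer would only stay accurate if its instruction runs at least once
# every 0.256 s: elapsed time between executions is tracked as an 8-bit
# count of milliseconds, so a gap over 255 ms wraps and time is lost.

last_clock_ms = 0   # free-running millisecond clock at the previous run

def update_timer(acc, clock_ms):
    """Add the (8-bit) milliseconds elapsed since the last execution
    to the timer accumulator and return the new accumulator."""
    global last_clock_ms
    delta = (clock_ms - last_clock_ms) & 0xFF   # wraps past 255 ms
    last_clock_ms = clock_ms
    return acc + delta

acc = update_timer(0, 100)    # scanned 100 ms apart: acc == 100, exact
acc = update_timer(acc, 400)  # 300 ms gap: only 44 ms counted
print(acc)                    # 144, not the true 400 - time was lost
```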

It would be much more likely to get it to work with the STI function. That has millisecond resolution and triggers off an interrupt, so as long as your code isn't too time consuming, even though it might interrupt the main program multiple times in a single scan, you could get it to work. At least on PLC-5s, I recall that the "housekeeping" function of the PLC takes 3 ms minimum, so millisecond RESOLUTION may be possible even on a PLC-5 through the STI, but not triggering an operation once per millisecond. I have seen code that uses the system registers on a PLC-5 to read the scan rate and dynamically update the PID instruction's "rate" information with it. I had all kinds of concerns about that sort of code, so I almost always trigger PIDs either via timer or via STI (preferably STI) because it's less complicated that way. So in summary, even on a PLC-5 it may be possible to achieve millisecond resolution, but it would be impossible to achieve a response of once per millisecond, in spite of only having a 1/100th second timer.

In answer to the question: the timers are updated using the scan time information that the PLC collects, so no, the scan rate won't DIRECTLY affect the timer. What affects a timer is that if you trigger off the /DN (done) bit, you're in for a surprise. The done bit is set once the elapsed interval is equal to or LARGER than the preset interval, and this causes scan-time-influenced jitter.

If you want to avoid this even with larger time bases, imagine for instance wanting to trigger some code once per second without scan-rate-influenced jitter. Simply setting up a 1 second timer won't work; if you try it, you'll notice that the interval is somewhat longer than 1 second (try building a 24 hour totalizer). The solution is to set up, say, a .01 second resolution timer with a 2 second preset. Then during each program scan, check whether the accumulator exceeds a count of 100. If it does, subtract 100 from the accumulator and trigger the code that should be running on a 1 second interval. It will have some jitter (sometimes slightly less than 1 second, sometimes more), and the jitter is scan time dependent, but you will trigger your code at an average RATE of once per second, and the jitter will be less than your maximum scan time (in milliseconds).

The STI function gives you the exact same functionality, but you are limited to a single ladder and a single time interval. It has the advantage that it can trigger multiple times per main program scan, however, and jitter will be significantly less, since about the only thing that can cause it is I/O. Anyways, that's my two cents on creative ways to handle time.
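
Here's a minimal Python sketch of that accumulator-subtract technique, with plain variables standing in for the TON's ACC and PRE (the names and the scan model are illustrative assumptions, not AB code):

```python
# Minimal sketch of the accumulator-subtract technique: run a .01 s
# base timer with a 2 s preset so it never reaches done, and peel off
# 100 counts (1 s) whenever they have accumulated.  Each firing jitters
# by up to one scan, but the long-run rate is exactly once per second
# because the remainder is carried forward instead of discarded.

acc = 0     # stands in for the TON accumulator, in .01 s counts
PRE = 200   # stands in for the preset: 2 s, deliberately never reached

def one_second_task():
    print("runs at an average rate of once per second")

def scan(new_counts):
    """One program scan; new_counts is how many .01 s counts the
    timer gained since the previous scan (scan-time dependent)."""
    global acc
    acc += new_counts
    if acc >= 100:            # a full second has accumulated
        acc -= 100            # carry the remainder so no time is lost
        one_second_task()
    assert acc < PRE          # the timer never reaches done

# Simulate a few scans with uneven scan times (gains in .01 s counts):
for counts in (30, 40, 35, 60, 45):
    scan(counts)              # fires twice across ~2.1 s of total time
```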

That's not really the issue. A timer keeps accurate time whether it's in an STI, the main ladder, or a subroutine, or even if it's called multiple times in one scan (you can do that, by the way). The instruction just needs to be executed at least once every .256 seconds for the .001 time base, or once every 2.56 seconds for the .01 and 1 time bases.

The issue has to do with the response time of your I/O. A physical input can turn on at any time in the PLC scan, but the program does not "know" at that instant that the input is on. It will complete the current program scan, at which time the input image table is updated, and the program will then begin a new scan. Now, finally, the program "knows" that the input is on. Thus a timer triggered by that input has already missed several milliseconds of elapsed time; from then on it will keep accurate time. Since the purpose of a PLC is to control objects, we are constrained by the hardware that interfaces with the world.
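
A toy model of that input-image latency (the 5 ms scan time and the function are assumptions, purely for illustration):

```python
# Toy model of scan-cycle input latency: an input that changes mid-scan
# is not seen by the program until the input image is next updated, so
# a timer started from that input misses up to one full scan of time.

SCAN_MS = 5   # assumed program scan time; real values vary by program

def first_seen_ms(input_on_at_ms):
    """Return when the program first 'sees' the input: the start of
    the scan after the one during which the input physically changed."""
    return (input_on_at_ms // SCAN_MS + 1) * SCAN_MS

print(first_seen_ms(1))    # turns on at t=1 ms  -> seen at t=5 ms
print(first_seen_ms(12))   # turns on at t=12 ms -> seen at t=15 ms
```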

The issue originally posted was 1 ms timers, regardless of hardware. Even with local buses and DC I/O, it's unusual to see hardware that can reach those speeds, very true; usually even a proximity switch responds on the order of a few milliseconds. As to responding faster, though, that's not quite true. For instance, on AB PLCs there's the IIM instruction, which is specifically for directly accessing local I/O without waiting for the scan to complete. Similarly for reaction rates, there's the programmable interrupt (PII), the counterpart of the STI, which again is useful for ridiculously fast I/O response problems. I believe someone mentioned before on this forum that one such application is the cutter/sealer used in making lollipops.

