We are trying to generate a sine signal in LabVIEW and send it to an Arduino Uno microcontroller over a USB cable, so that we can monitor the sine wave on an oscilloscope. However, we haven't figured out how to use VISA Write to send the data to the serial port. Can anyone give us tips or advice on how to implement this project?
How can we send a sine signal from LabVIEW to Arduino using VISA Write?
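For what it's worth, the host-side idea can be sketched outside LabVIEW as well. The following is a minimal Python sketch of the same concept (build one period of a sine wave as bytes, then stream it out the serial port). The port name "COM3", the baud rate, and the use of pyserial are assumptions for illustration; the Arduino side would have to read the incoming bytes and reproduce them (e.g. via PWM) for the scope to show anything.

```python
import math

def sine_samples(n=100, amplitude=127, offset=128):
    """Build one period of a sine wave as single-byte samples (0-255),
    suitable for streaming over a serial link one byte at a time."""
    return bytes(
        int(offset + amplitude * math.sin(2 * math.pi * i / n))
        for i in range(n)
    )

# Streaming side (hypothetical port name; requires the pyserial package):
# import serial
# with serial.Serial("COM3", 115200, timeout=1) as port:
#     while True:
#         port.write(sine_samples())
```

In LabVIEW the equivalent would be: generate the waveform, convert each sample to a byte (or a formatted string the Arduino parses), and wire that string to VISA Write inside a loop, after configuring the serial port with VISA Configure Serial Port.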
Re: Updating single indicators within a cluster?
rkmadse wrote:So, I simplified what I was doing, removed the state machine, and now it works fine. Not sure what the hang up was, but it seemed like it was going into a wait state before it was finishing the update on the indicators. So I'll see if I can just stick with the simplicity and make it work. I have attached that type def if you want to look further into what I was doing wrong.
1. No need for that local variable inside the case structure. The indicator will update on the next loop iteration thanks to the shift registers.
2. You had a weird state machine going. What you have here is A LOT easier to understand: you just enqueue all of the states you want to go to, in the order you want them. If you get an E-Stop condition, use Enqueue Element At Opposite End to make sure it is handled immediately.
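The queued-state-machine idea above can be sketched in text form. This is a conceptual Python sketch (the state names are made up): normal states go on the back of the queue, while an emergency stop jumps to the front, which is exactly what Enqueue Element At Opposite End does in LabVIEW.

```python
from collections import deque

# Normal states are enqueued at the back and consumed from the front.
queue = deque(["Init", "Acquire", "Analyze", "Save"])

def enqueue(state):
    queue.append(state)          # normal order

def enqueue_estop():
    queue.appendleft("E-Stop")   # jumps the line, handled next

enqueue_estop()
order = list(queue)              # "E-Stop" is now first in line
```

The consumer loop would simply pop from the front (`queue.popleft()`) each iteration, so an E-Stop pushed to the front is always the very next state executed.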
read in labview complex binary file written in matlab and vice versa
Dear All. We use the attached "write_complex_binary.m" function in MATLAB to write complex numbers into a binary file. The format used is IEEE floating point with big-endian byte ordering. We then use the attached "read_complex_binary.m" function to read the complex numbers back from the saved binary file. However, I just don't seem to be able to read the generated binary file in LabVIEW. I tried using the "Read from Binary File" function with big-endian byte ordering, with no success. I am sure my lack of knowledge of how the LabVIEW function works is the reason. I also can't seem to find any helpful resources on the matter. I was hoping someone could kindly help with this or give me some directions to work with.
Thank you loads in advance. Please find attached the two matlab functions zipped. Kind regards.
Re: Find a capital letter in a string.
I think your teacher will laugh at that solution.
Re: Timeout errors when using high speed camera.
spoonsso wrote:If anyone is still having this issue, I just fixed it on my setup (PCIe-1429 & Mikrotron 1362).
The trick was to set the ROI with offsets and width/height in MC Control Tool, and then update *only* the ROI width/height in MAX, leaving the offsets set to 0.
Hope this helps someone out there.
Don't forget to give yourself credit and mark the thread as "solved" (by you!).
Re: Find a capital letter in a string.
Mike...
Re: How can we send a sine signal from LabVIEW to Arduino using VISA Write?
Mike...
Re: Single Precision Float (SGL) not working as per IEEE754
But more to the point, why are you going to all the trouble with a loop and three different reformats? Just do the reverse of the original conversion.
Actually you can make it even easier.
1. Cast the SGL directly as an array of U8s.
2. To convert back, cast the array as an SGL.
3. Stick a fork in it, and call it done...
Flatten to String will also do the conversion, and Unflatten from String will change it back.
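The type-cast round trip described above can be illustrated in Python with the `struct` module (a sketch of the concept, not the LabVIEW implementation; big-endian byte order is assumed here because that is LabVIEW's default for Type Cast and Flatten to String):

```python
import struct

def sgl_to_bytes(x):
    """Cast a single-precision float to its 4 raw bytes
    (big-endian, matching LabVIEW's default Type Cast order)."""
    return struct.pack(">f", x)

def bytes_to_sgl(b):
    """Reverse cast: 4 raw bytes back to a single-precision float."""
    return struct.unpack(">f", b)[0]

raw = sgl_to_bytes(1.0)   # SGL 1.0 is the IEEE 754 bit pattern 3F 80 00 00
```

No loops, no intermediate string formatting: the cast just reinterprets the same four bytes, which is why the round trip is lossless.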
Mike...
Re: Stacked chart legend scrollbar unresponsive
Mike...
Error 1386 between LabVIEW and Quantum Design PPMS
(This is a different problem from http://forums.ni.com/t5/LabVIEW/Controlling-PPMS-from-Labview/m-p/2581885/highlight/true#M777361)
I also check the Quantum Design forum and webpage, but did not find a solution.
My initial guess was that something was missing with the QDInstrument.dll file, but now I think there might actually be a problem with the way LabVIEW interacts with the .NET framework and invoke nodes. For this reason I am posting to the NI Forum. Sorry for the long text, but I have tried to include as much information as possible.
I am trying to interact with the PPMS from Quantum Design. I use the library (PpmsMultiVu.zip) attached to this message. It contains an example VI, but it requires the QD software to run in parallel.
I get an error 1386 with this error message:
Error 1386 occurred at Invoke Node in OpenQDInstrument.vi->QDInstrument_Example.vi
Possible reason(s):
LabVIEW: The specified .NET class is not available in LabVIEW.
Interestingly, the error only happens when the QD library is installed on my C drive. It does not matter whether it is in the LabVIEW 2014 instr.lib folder or anywhere else, such as the desktop.
Also, if I check View > .NET Assemblies in Memory, it is empty.
If I install the library on a network drive, everything runs nicely, and .NET Assemblies in Memory has one entry.
Also, if I have two copies of the library, one on the C drive and the other on the network drive, launching the example VI from the C drive still gives the error, even though the files are still located in the same place on the network drive.
The library makes heavy use of .NET invoke nodes.
Earlier I had a "Failed to connect to OLE server" error while working with this library, but it resolved itself after a computer reboot, without a logical explanation.
Thanks for your input
Re: Error 1386 between LabVIEW and Quantum Design PPMS
The library in question
Re: Single Precision Float (SGL) not working as per IEEE754
Your snippet works just fine for me.
PS: You don't need the SGL bullet in the upper picture. Just wire a constant to the type input of the conversion and set its representation to single-precision float. Also, the %s on the upper Type Cast is unnecessary, since Type Cast defaults to a string output.
Re: Single Precision Float (SGL) not working as per IEEE754
I agree with the others that you need to clean up your code, because it is just way too convoluted. Your "bytes Input" should probably be U8.
To demonstrate the concept, you shouldn't even use formatted strings, because (especially with SGL) you might lose significant information by detouring through a decimally formatted intermediate representation of the binary values.
Your use of a string containing %s for the upper Type Cast input is just silly. Since only the type is important and the value is irrelevant, you should probably wire an empty string (or leave it unwired, since string is the default). Typing a formatting statement in there does not change things, but it might really confuse somebody looking at the code.
You can also avoid your "toSGL" if you wire an SGL constant to the default input.
All you probably need is a hexadecimally formatted string control/indicator, an SGL control/indicator, and a Type Cast operation. Eliminate the middleman!
(Of course you can easily go from SGL to string and back. Same difference.)
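The "eliminate the middleman" point (hex string directly to SGL and back, no decimal formatting in between) can be sketched in Python for anyone following along outside LabVIEW. Big-endian order is assumed, matching LabVIEW's Type Cast:

```python
import struct

def sgl_to_hex(x):
    """SGL -> hexadecimally formatted string: pack the float32 bits
    big-endian and show them as hex digits."""
    return struct.pack(">f", x).hex().upper()

def hex_to_sgl(s):
    """Hex string -> SGL: parse the hex digits back to 4 bytes and
    reinterpret them as a single-precision float."""
    return struct.unpack(">f", bytes.fromhex(s))[0]
```

Because both directions only reinterpret the same 32 bits, the round trip is exact; a decimal intermediate string would not be.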
Re: read in labview complex binary file written in matlab and vice versa
Be a Scientist -- do an Experiment.
I presume you know Matlab and can generate a little complex data and use the Matlab function to write it to a file. You can also look at the Matlab function that you posted -- you'll see that Matlab takes the array of complex values apart into a 2-D array of [real, imaginary], which is written as 32-bit floats, which LabVIEW calls "Sgl".
So now you know you need to read an array of Sgls, and figure out how to put it together again into an array of Complex.
When I did this experiment, I made the Real part of the (Matlab) Complex data [1, 2, 3, 4], and the Imaginary part [5, 6, 7, 8]. If you are curious, you can write these data out in Matlab with your Complex Write function, then read them back as a simple Array of Dbl to see how they are ordered (there are two possibilities -- [1, 2, 3, 4, 5, 6, 7, 8] if written "All Reals, All Imaginaries", or [1, 5, 2, 6, 3, 7, 4, 8] if "Real, Imaginary pairs").
Now you know (from the Matlab function) that the data are an array of Sgl (to LabVIEW). I presume you know how to write the three-function routine that will open the file, read the entire file into an Array of Sgl, and close the file. Do this experiment and see if you see meaningful numbers. The "catch" is the byte-ordering of the data -- does Matlab use the same byte ordering as LabVIEW? [Hint -- if you see numbers from 1 to 8 in either of the above orders, you have the byte-ordering correct, and if not, try another byte ordering for the LabVIEW Read Binary function].
OK, now you have your Sgl Array of 8 numbers, and want to convert it to an array of 4 Complex, [1+5i, 2+6i, 3+7i, 4+8i]. Once you have figured out how to do this, your problem is solved.
To help yourself when you actually go to use this code, write it as a sub-VI whose input is the Path to the file you want to read and whose output is the Array of CSG contained in the file. My LabVIEW routine had 8 LabVIEW functions -- three for File I/O, and 5 for converting the 1D Array of Sgl into a 1D Array of CSG. No loops were needed. Give it a try -- you can test it against the Matlab Data File you used for your Experiment (see above) and if you get the expected answer, you've written correct code.
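For readers who want to check their understanding of the parsing step outside LabVIEW, here is a Python sketch of the same pipeline: take big-endian float32 bytes and rebuild complex numbers. It assumes the "Real, Imaginary pairs" interleaving; if the experiment above shows "All Reals, All Imaginaries" instead, split the float array in half rather than slicing by stride.

```python
import struct

def bytes_to_complex(raw):
    """Parse big-endian float32 data stored as real/imaginary pairs
    (one of the two layouts discussed above) into complex numbers."""
    count = len(raw) // 4
    floats = struct.unpack(">%df" % count, raw[: count * 4])
    # Even indices are reals, odd indices are imaginaries.
    return [complex(re, im) for re, im in zip(floats[0::2], floats[1::2])]

# Bob's test vector: reals [1, 2, 3, 4], imaginaries [5, 6, 7, 8],
# interleaved as [1, 5, 2, 6, 3, 7, 4, 8]:
raw = struct.pack(">8f", 1, 5, 2, 6, 3, 7, 4, 8)
```

In LabVIEW the same stride-slicing is done with Decimate 1D Array, and the pair is combined with Re/Im To Complex, which is where the "5 functions for conversion" in Bob's count come from.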
Bob Schor
Re: read in labview complex binary file written in matlab and vice versa
Thank you Bob. I have been doing exactly what you said since yesterday, except for the "CATCH": I did not fiddle with the byte ordering. Interestingly, the byte ordering was indeed all upside down. Kind regards.
Re: xl sheet table column and row header
Hi,
How do I merge cells? I tried the approach shown in the attached image, but it returns an error. I think the input I am giving is wrong, so please tell me how to give the input to the Merge property node.
Re: xl sheet table column and row header
(Answer posted as an attached image.)
Re: Shared variable not reliable? Problems syncing multiple machines
In my experience SVs have generally been pretty reliable, except when they aren't.
The most interesting solution I've found to an SV failing to update wasn't in code, but in running the Distributed System Manager. With the DSM running and the libraries containing the problematic variables expanded, the variables updated without a hitch. If the DSM was closed or the libraries not expanded, the code would still write to the variable, but the updated value would never get read on the other machines (cRIOs). Very strange.
Have you tried disabling firewalls on both the server and client PCs?
Re: Linux RT Keyboard
Have you enabled the embedded UI on the target (done in MAX), and tried running your application deployed on the cRIO? From what I recall any inputs won't work if you're running the VI from the LabVIEW development environment (the Remotely Controlled watermark is in the corner).
When you plug a display into the cRIO to view the embedded UI, can you open up a terminal and type any text at all?
This embedded UI example may prove useful in testing - https://decibel.ni.com/content/docs/DOC-36546
Re: Shared variable not reliable? Problems syncing multiple machines
Thanks all for your answers so far!
PatrickLye wrote:
- Are all of the machines on the same subnet?
- Do you have good gigabit connections on all machines?
- Download Wireshark and look at your network traffic to see if you are having packet loss.
- Try Putting the machines all on a separate switch that is their own network for testing and see if the problems go away. If it does you may be able to set up a virtual network in the switch for these machines and then only allow traffic in/out of the network that is needed.
- Can you turn off the firewalls for testing?
- Try turning off logging in your network variable properties.
- Turn on buffering and set up a buffer size that will work on your network variable properties.
- Disable the enable enhanced DSC run time support in the Advanced tab of the build executable.
- I’ve found shared variables to be incredibly dependable on my industrial network and a bit problematic on our office network. They need to have good connectivity, and our office network isn’t all that good. They are working on it.
1. Yes
2. It's gigabit, but whether it's good or not I don't know. They are all on the same switch, but some other traffic is also there (remote access, database connections...), and the IPs are accessible across the whole company (~3000 employees in 4 countries).
3. I tried Wireshark in the past, but I don't know how to use it properly to detect network issues. Maybe I can get some support here.
4. We would need a USB-to-Ethernet adapter, because there is no free Ethernet port on the machines. Because we don't really trust these USB devices, we haven't tried it yet. We'll try it in the next few days!
5. The firewall is already turned off, and antivirus has been disabled.
6. I don't have a "Logging" section. Does that mean I don't have DSC installed?
7. Will buffering improve the reliability? If we just get old buffered values, the alive bit might also be missing.
8. We don't have any EXE running yet; all stations are running in the IDE. Is that a disadvantage?
---
I read about a logos.ini, which can be used in LV 8 to restrict shared variables to a specific MAC. Is this also possible in LV 2014?
/EDIT: It seems that the problems are not totally random. The line might run without any issue for some hours; once the problems start, they occur frequently. Closing LabVIEW and restarting it on all stations seems to clear the issues for some time. Maybe the SVE runs into problems after a while? Originally we just used Read/Write Shared Variable with a connection string wired to it. Then we started to open the SVs once and then use the refnums, but it did not really improve the situation. But like I said, the read/write VIs do not throw an error -- the values are somehow just not equal on both sides.
Is it possible that the network (switch?) can run into problems after some time?