As of December 1, 2020, Focal Point is retired and repurposed as a reference repository. We value the wealth of knowledge that's been shared here over the years. You'll continue to have access to this treasure trove of knowledge, for search purposes only.
Only the original poster has the option of adding [SOLVED] to the heading to let everyone know the question is closed (although others can still respond if they have something to add or ask).
Alternatively, the moderators of this forum can change it for you, if they deem it appropriate.
It's nice to give credit to the forum (and the one who solved your problem) when you find an answer. So nice...
I wish I had been in on this thread when you were talking about +0 and -0, because it reminded me of normalizing mantissas in floating-point numbers, which only works if there is at least one bit that's set to "1"! How do you normalize a true '0'? You can't! There's no bit set in zero.
Someone here in this thread said that you can do it in assembler by setting the sign bit and nothing else, but then two's complement confounds the issue.
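For what it's worth, the +0/-0 distinction really does exist in IEEE 754 floating point, which stores the sign as a separate bit (sign-magnitude, not two's complement, which applies to integers): -0.0 is all-zero bits except the sign bit, and the two zeros compare equal even though their bit patterns differ. A minimal sketch, assuming a platform with standard IEEE 754 64-bit doubles (which Python's `float` is on essentially all modern hardware):

```python
import math
import struct

def bits(x: float) -> str:
    """Return the 64-bit IEEE 754 pattern of x as a hex string."""
    return struct.pack('>d', x).hex()

# +0.0 and -0.0 differ only in the sign bit (the top bit of the first byte).
print(bits(0.0))    # 0000000000000000
print(bits(-0.0))   # 8000000000000000

# Yet arithmetic treats them as equal...
print(0.0 == -0.0)  # True

# ...and the sign still survives, observable via copysign.
print(math.copysign(1.0, -0.0))  # -1.0
```

So "setting the sign bit and nothing else" is exactly how negative zero is represented; it is only integer formats, where two's complement has a single zero, that lack this quirk.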
Who invented 'zero' anyway ?
OK, so it is without a doubt the most important number there is, but it sure causes a lot of trouble for something so small!
This message has been edited. Last edited by: Charlz,
Nubi, I think this is the first time I've seen someone from the UK saying this... Now the only things we (the Europeans) would like them (the UK) to do are: introduce the euro, drive on the proper side of the road, use the metric system... and some other things I have forgotten for now... LOL
Frank
prod: WF 7.6.10 platform Windows, databases: msSQL2000, msSQL2005, RMS, Oracle, Sybase,IE7 test: WF 7.6.10 on the same platform and databases,IE7
Right, Nubi, it was the mathematicians from India who thought of '0' as having meaning as a 'placeholder' to distinguish '5' from '5000'.
x + 0 = x, and x - 0 = x, for all x. Simple, but who first thought of formally stating that, or even that it needed to be said?
Apparently the Romans thought that 'nothing was nothing' and therefore gave it no symbol, which may have held them back almost as much as the lead in their ceramic dinnerware (and may be why they never thought of using '0' in their numbering system in the first place?)
Roman numerals are fascinating, but can you imagine doing even simple arithmetic with them, much less calculus?
Maybe it seemed to the Romans like "Much ado about nothing," but I'm sure Pascal or Descartes (who invented the Cartesian join ;>p) would disagree.
Without the '0', where would we be today? How would it affect your paycheck if they simply left out the '0's?