17 Nov Environmental Awareness for AI Geeks
Selecting an Environment and Tools
I plan to take a few posts in this section to focus on expert systems (Giarratano 1989), exploring the development process in greater detail. While some projects require no development, others require you to select a platform, a development environment, or both. There are specially designed development platforms, tools and environments that provide much of the functionality required to build expert systems. They come in a few different forms, often tailored to specific types of processes, industries or solution spaces. Some provide enough generic domain functionality that we only need to design and build what is specific to our domain and problem. For some intelligent processes, however, expert systems are not useful, so we need to select a lower-level programming platform and environment.
The choice of programming languages and tools (environmental awareness of a certain flavor) affects the performance optimization obtained through control and the expandability gained by careful modeling (Gear 1969). Rules engines, model frameworks and ontology management mechanisms facilitate the standardization of inference techniques and knowledge representation. For large projects and programs that need to communicate with other programs and incorporate knowledge “learned” by other programs, a carefully selected declarative language may be the best choice. This was why Prolog was chosen for Japan’s ICOT “fifth-generation” project. For ICOT, however, the performance deficiencies of Prolog became show-stoppers, so everything that had been written in Prolog was transcribed into C. The purpose of programming with logic was thus completely defeated.
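To make the idea of a rules engine concrete, here is a minimal forward-chaining sketch in Python (my choice for illustration; the post names no particular implementation). The rules and facts are hypothetical; the point is that the knowledge is declarative while the inference loop is generic and reusable, which is what standardized inference buys you.

```python
# A minimal forward-chaining inference loop: rules are declarative
# (premises -> conclusion) and the engine that applies them is generic.
def forward_chain(facts, rules):
    """Apply rules repeatedly until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            # Fire the rule if all premises hold and it adds a new fact.
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical knowledge base: each rule is (set of premises, conclusion).
rules = [
    ({"has_feathers"}, "is_bird"),
    ({"is_bird", "cannot_fly"}, "is_penguin"),
]
derived = forward_chain({"has_feathers", "cannot_fly"}, rules)
print(sorted(derived))
```

A production rules engine adds pattern matching over structured facts, conflict resolution and truth maintenance on top of this loop, but the separation of declarative knowledge from a generic inference mechanism is the same.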
|Understanding Context Cross-Reference|
|Click on these Links to other posts and glossary/bibliography references|
|Prior Post||Next Post|
|Planning a Knowledge Project||Identifying and Acquiring Knowledge|
|expert system AI||Slagle 1963 Slagle 1971 Gear 1969|
|programming environment||Booch 1986 Desrochers 1987|
|Alpha Anywhere Mercury||Giarratano 1989 MacLennan 1987|
Back near the end of the fourth generation, the US Department of Defense selected Ada (Booch 1986), an explicitly parallel language, as the language for mission-critical systems programming. The selection was intended to guarantee interoperability and transportability of code segments, programs, and embedded systems. It took time for Ada to come of age, and it has yet to attract the attention of enough serious programmers to vindicate the effort as a commercial success. Current technologies, however, appear to supplant the need for explicitly concurrent programming, which requires complex coordination points and rendezvous. The industry has moved the parallelism into the machine (Desrochers 1987), first with multi-processor machines, then with multi-threading inside a single chip. The speed of today’s processors, combined with very smart technologies for dividing and conquering code, makes practical the advanced and complex processes once considered so time-consuming that parallel programming was necessary. This has paved the way for commodity programming languages (MacLennan 1987) to play in the stratosphere of high performance. I believe parallel languages like Ada still have a place, but the demands they place on architects and software engineers are so high that I expect them to survive only in small niche domains.
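The divide-and-conquer style that commodity languages now offer can be sketched in a few lines of Python (again, an illustration rather than anything the post prescribes): the runtime's executor farms chunks of work out across threads, with no explicit coordination points or rendezvous written by the programmer.

```python
# Divide-and-conquer parallelism in a commodity language: the executor
# schedules the chunks; the programmer writes no explicit rendezvous.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    return sum(chunk)

data = list(range(1_000_000))
# Split the work into independent chunks.
chunks = [data[i:i + 100_000] for i in range(0, len(data), 100_000)]

with ThreadPoolExecutor() as pool:
    # map() preserves chunk order; joining results is a plain sum.
    total = sum(pool.map(partial_sum, chunks))

print(total)
```

Contrast this with Ada-style tasking, where the programmer declares the tasks and writes the rendezvous (entry/accept) points explicitly; here the synchronization lives entirely inside the executor.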
TFS and Visual Studio
For .NET developers, Microsoft Team Foundation Server (TFS) and its integrated development environment (IDE), Visual Studio, provide a complete suite of tools, or let developers choose their own favorites and plug them in. TFS is the software engineers’ extensible collaboration platform upon which Microsoft’s application lifecycle management (ALM) strategy is based. Shops with significant investment in the Microsoft stack, Active Directory, SQL Server databases, SSIS, SSAS, SSRS, SharePoint content management, or business solutions such as Dynamics, often need .NET software development capabilities. TFS provides hooks and latches into ALM governance tools including QA management, promotion management, build management and version control. Many of the best developers are cowboys and cowgirls, and TFS helps provide a bit of structure to the lifecycle of building new applications, integrations and services. This structure is needed to reduce the risk of failure in software projects with defined scope, schedule and budget.
Eclipse, which began life as a tool to help software developers write code in Smalltalk and other languages, has evolved into a life unto itself, serving as the core interface for tools like Protégé and Be Informed Studio, and as the most widely used IDE I know of. Its suite of helpful tools, including build management and direct connections to revision control systems such as Git (via GitHub) and Apache Subversion, makes it a powerful platform.
Besides build management, release management and revision control, there’s a lot more to line up when you want to develop applications, especially in an “enterprise” where integration with other systems, data and services is required. In future posts I’ll share some of the things I have learned in my roles as Enterprise Systems, Integration and Data Architect, and lay out some ideas about what you need at the outset to introduce knowledge systems into an ecosystem of existing information and data systems.
|Click below to look in each Understanding Context section|
|4||Perception and Cognition||5||Fuzzy Logic||6||Language and Dialog||7||Cybernetic Models|
|8||Apps and Processes||9||The End of Code||Glossary||Bibliography|